Channel: Terence Luk

Infrastructure as Code in 15 Minutes PowerPoint Presentation


It’s finally April, and this month is typically when I perform a bit of spring cleaning on my laptop to avoid having files and folders become too disorganized. One of the files I came across as I sorted through my documents folder is a PowerPoint I created a while back when I interviewed for a role where I was asked to create a presentation on a topic of my choice and present it to 5 interviewers. Rather than choosing a topic I was extremely fluent in, I decided to try my luck with something I was learning at the time: Infrastructure as Code with Terraform. I did not want the presentation to be too focused on a specific vendor, so I spent most of the time talking about the benefits of IaC, then presented Terraform as a solution. The window I had to work with was 30 minutes, so I kept the presentation short to leave some time at the end for questions. The feedback I received was very positive, as 3 of the 5 interviewers expressed how much they liked my presentation. Given that this presentation was created on my own personal time, I think it would be great to share it in case anyone is looking for material to introduce an audience to IaC. This specific role I interviewed for has a special place in my heart because of the interviewers I had the opportunity to meet and how supportive every one of them was. The marathon of interviews was long but extremely gratifying, and I enjoyed the experience even though I wasn’t selected in the end.

Without further ado, the PowerPoint presentation can be downloaded here: https://docs.google.com/presentation/d/1v8X1e9RimDdkpiR01Rj5et_Mnip5n0u0/edit?usp=sharing&ouid=111702981669472586918&rtpof=true&sd=true

I will also paste the content of the presentation below along with the notes I used during the presentation. Enjoy!

image

Intro

Good afternoon everyone and thank you for attending this presentation. The topic I will be presenting is Infrastructure as Code in 15 minutes.

image

Agenda

The agenda today will begin with a look at how we traditionally deploy infrastructure, followed by What is Infrastructure as Code, also known as IaC. Then the benefits of IaC, what is imperative vs declarative, IaC with Terraform, IaC in DevOps Pipelines, a sample setup and finally Q&A.

image

Traditional infrastructure deployment

Infrastructure has traditionally been deployed through the use of a graphical user interface and scripts. As user-friendly as GUIs are, the obvious challenge is that the process is very much manual, time-consuming, and prone to errors made by the administrators performing the configuration. Maintaining consistency is very difficult, leading to configuration drift, and keeping multiple environments that are meant to mirror one another in lockstep is challenging. Scaling the environment is cumbersome (e.g. deploying more VM instances or adding new subnets). Lastly, there isn’t an easy way to document the environment other than screenshots and spreadsheets containing configuration values.

Scripting adds a bit of automation, but scripts are often difficult to maintain over time.

image

What is Infrastructure as Code?

Infrastructure as Code is essentially the management and provisioning of infrastructure through code. Leveraging code means that we can now introduce automation into the management of the infrastructure, whether we are creating new resources or modifying them. Infrastructure as Code can be implemented as imperative or declarative, an important distinction we will cover shortly.

image

Benefits of IaC

To further elaborate on the benefits of IaC, it is now possible to automate deployment not only in one cloud but across multiple clouds such as GCP, Azure and AWS. The speed and efficiency of deployment can be greatly increased as the process eliminates the administrator’s manual points and clicks; the process is also repeatable and consistent, allowing multiple environments to be deployed in lockstep. The code can easily be source controlled with versioning, which gives way to team collaboration through Continuous Integration, and CI/CD pipelines can be used to develop and deploy the infrastructure while leveraging all the benefits of DevOps. Infrastructure management can be simplified and standardized through policies and can scale with ease – think about tweaking a variable to scale from 1 to 100 rather than going into a GUI and deploying or cloning resources multiple times. Static application security testing, the process of reviewing source code to detect vulnerabilities, can now be performed rather than trying to comb through deployment configuration documentation or the GUI after the infrastructure is deployed. Manual labour is significantly reduced.

image

Imperative vs Declarative

One of the important aspects of IaC is the concept of imperative vs declarative. To put it in simple terms, let’s consider the end state or goal we want to achieve is to get to a pizza restaurant. Imperative can be described as “what to do” while declarative is “what is wanted.” So let’s say we hop into a taxi and want to get to this end state. An example of imperative instructions would be to tell the taxi driver to go:

  • Forward 1 mile
  • Turn right
  • Forward 2 miles
  • Turn left
  • Forward 3 miles
  • Arrive at pizza restaurant

While declarative is:

  • Go to the pizza restaurant.
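In Terraform terms, the declarative instruction is just a description of the desired end state. A minimal illustrative HCL fragment (the resource type and names here are made up for the analogy):

```hcl
# Declarative: state only what should exist; Terraform works out
# the create/update/delete steps required to get there.
resource "azurerm_resource_group" "pizza" {
  name     = "pizza-restaurant-rg" # hypothetical name
  location = "canadacentral"
}
```

Applying this twice produces the same result, which ties into the idempotency discussed shortly.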

image

Let’s dissect the differences and outline them.

With imperative, the starting point matters because we are explicitly calling out each action to get to the end state. This leads to difficulty in auditing the steps and detecting drift when changes are made. Version control is challenging, if even possible. If the steps execute halfway and stop due to an error, you cannot simply repeat them without ending up in a completely different state. The logic can get very complex since ordering matters, and changing the destination state requires modifying the steps.

Declarative, on the other hand, allows the starting point to be anywhere because the engine delivering or carrying you to the end state will handle the directions. Declarative is idempotent in nature, so you can run it as many times as you want without changing the resulting state. The code can also be run repeatedly in a pipeline to create multiple environments in lockstep. Having removed the detailed imperative steps, we can easily validate the configuration, detect any drift, and introduce version control. Lastly, we can change the destination without worrying about changing the steps.

image

IaC with Terraform

One of the popular IaC solutions currently on the market is Terraform. It is written in Go, uses a declarative language known as HashiCorp Configuration Language (HCL), and has multi-cloud support. It handles deployments to multiple clouds through the use of providers, and there are approximately 1,521 providers currently available on their site. Terraform code is written in plain text and can be source controlled with Git or Terraform Cloud. Security can be introduced through RBAC so that multiple workspaces for different teams managing different environments, or components of them, can only make changes to their own environments. Lastly, policies with automation can be introduced to provide control and governance.
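As a rough sketch of what HCL looks like (a minimal example; provider version pins and authentication are omitted), declaring the Azure provider might be:

```hcl
terraform {
  required_providers {
    # The provider determines which cloud Terraform talks to;
    # swapping in aws or google targets other clouds.
    azurerm = {
      source = "hashicorp/azurerm"
    }
  }
}

provider "azurerm" {
  features {} # required (possibly empty) block for the azurerm provider
}
```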

image

IaC with DevOps Pipelines

What IaC enables, which I feel is the most powerful aspect, is the use of pipelines. With IaC we can now leverage the DevOps methodology with CI/CD pipelines to deploy infrastructure. Pipelines can be created to deploy only infrastructure, or they can incorporate the deployment of infrastructure as part of an application, in which case the IaC is only a small component of the pipeline. The flow diagram shown here is a simplified depiction of the process, as we can integrate many different stages into the pipelines such as security scans and testing. This unlocks true automation and different release strategies.

image

Sample Setup

To demonstrate how we can fully automate the deployment of cloud resources, I have prepared a simple sample configuration where I will go through the setup in the following slides.

image

Prerequisites

We will assume that Jenkins along with the Terraform plugin is deployed, a GitHub repo with the Terraform deployment code is created, and a service principal (in this case Azure) is set up for Jenkins so it can deploy resources. As shown in the screenshots, we’ll have Jenkins with the Terraform plugin installed, the GitHub repo where the Terraform code is pulled, and finally the service principal created in Azure.

image

Create Jenkins Pipeline

First, we’ll write a Jenkins pipeline with 4 stages for the infrastructure deployment:

  • Checkout, which will check out the code in the GitHub repo
  • Init, which will initialize Terraform and download the required provider
  • Plan, where terraform plan performs a dry run and outputs the proposed changes to the console
  • Apply/Destroy, where terraform apply or terraform destroy is executed to either deploy or remove the infrastructure
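A skeleton of such a pipeline might look like the following. This is a hedged sketch rather than the exact Jenkinsfile used in the demo; the repository URL is a placeholder and the Terraform CLI is assumed to be available on the agent's PATH:

```groovy
pipeline {
    agent any
    parameters {
        // Operator chooses the action at build time
        choice(name: 'ACTION', choices: ['apply', 'destroy'], description: 'Terraform action to run')
    }
    stages {
        stage('Checkout') {
            steps { git url: 'https://github.com/<account>/<repo>.git', branch: 'main' }
        }
        stage('Terraform Init') {
            steps { sh 'terraform init' }
        }
        stage('Terraform Plan') {
            steps { sh 'terraform plan -out=plan.tfdata' }
        }
        stage('Terraform Apply or Destroy') {
            steps {
                script {
                    if (params.ACTION == 'apply') {
                        // Apply exactly what the plan stage produced
                        sh 'terraform apply -auto-approve plan.tfdata'
                    } else {
                        sh 'terraform destroy -auto-approve'
                    }
                }
            }
        }
    }
}
```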

image

Parameterize the Jenkins pipeline

This simple setup will require administrator intervention to choose either apply or destroy, so we’ll configure a choice parameter for the pipeline. Note that we can also use triggers to automatically initiate the build through commits.

image

With the execution parameters setup, we will proceed to paste the code into the pipeline.

image

Build Pipeline

Then finally with the pipeline configured, we’ll initiate the pipeline build interactively by choosing apply, then we can view the progress as shown in the screenshot above. Once the build is complete, we should see the resources in Azure.

This short demonstration only scratches the surface of the possibilities of IaC with pipelines. Another example could be a pipeline that deploys an application and includes the infrastructure build as a step for the target infrastructure.

image

Ending

This concludes my IaC in 15 minutes presentation. Thank you for attending and feel free to ask any questions or provide any comments.


PowerShell script for exporting Microsoft Teams user configuration to an Excel and importing user configuration with updated Excel file


I used to work with Microsoft Teams Direct Routing voice deployments quite often before moving to a more Azure-focused role, so one of my ex-colleagues recently reached out to ask if I had a script that could bulk configure settings for user accounts to enable them for Enterprise Voice. There was a deployment I worked on where there were several hundred users, all of whom had already been configured for Enterprise Voice but had extensions that were not DIDs because their existing PBX still forwarded inbound calls to the SBC and then to Teams. On the day of the cutover, when we moved the SIP trunks to the SBC, we had to bulk update their LineURI field. The approach I took was twofold:

  1. Write a script that exported all of the users’ Teams configuration to an Excel file
  2. Write a script that imported user settings from the same spreadsheet after it was updated with the full DID extensions

The following is a sample of the export:

image

This spreadsheet was then updated with the appropriate LineUri and used with an import script to update the settings. In addition to updating the LineUri attribute, the script also enables Enterprise Voice and configures the dial plan and voice routing policy.
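The core of the import logic is roughly as follows. This is a simplified sketch, not the published script: it assumes the ImportExcel module, spreadsheet column names such as DialPlan and VoiceRoutingPolicy that may differ from the actual export, and the cmdlets current at the time (newer Teams module versions use Set-CsPhoneNumberAssignment instead of Set-CsUser for the LineURI):

```powershell
# Sketch: read the updated spreadsheet and apply voice settings per user
Import-Module ImportExcel
$users = Import-Excel -Path .\TeamsUserConfig.xlsx # hypothetical file name

foreach ($u in $users) {
    # Enable Enterprise Voice and assign the new full DID
    Set-CsUser -Identity $u.UserPrincipalName -EnterpriseVoiceEnabled $true -LineURI $u.LineUri
    # Apply the dial plan and voice routing policy from the spreadsheet
    Grant-CsTenantDialPlan -Identity $u.UserPrincipalName -PolicyName $u.DialPlan
    Grant-CsOnlineVoiceRoutingPolicy -Identity $u.UserPrincipalName -PolicyName $u.VoiceRoutingPolicy
}
```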

I don’t have much use for the scripts anymore given that I don’t work in the Teams space but I wanted to share them in case anyone may be looking for this.

The export script can be found here at my Github: https://github.com/terenceluk/Microsoft-365/blob/main/Teams/Export-TeamsUserConfig.ps1

The import script can be found here at my Github: https://github.com/terenceluk/Microsoft-365/blob/main/Teams/Import-TeamsUserConfig.ps1

Hope this helps!

PowerShell scripts for exporting an Azure subscription's Azure SQL Databases to an Excel file and using the updated Excel file to export/backup the databases


I’ve recently been involved in a manual migration of multiple subscriptions from one tenant to another due to an organization change, and one of the components I was responsible for was the migration of the Azure SQL Databases. I had originally hoped to use the DMA (Data Migration Assistant), but attempting to select an Azure SQL Database as a source would throw an error indicating it was not supported. Given that there weren’t too many databases and only 2 would require an outage, we decided to export/back up the databases to a storage account and then import/restore them in the destination subscription.

While it is possible to manually export them via the GUI:

image

A more efficient way was to use PowerShell to export all of the subscription’s databases and their properties into an Excel file, update the Excel file with the SQL credentials, then use PowerShell to read through the Excel spreadsheet and export/back up the databases to a storage account.

The PowerShell script I created that will export all of a subscription’s Azure SQL Database properties can be found here: https://github.com/terenceluk/Azure/blob/main/PowerShell/Export-All-Subscriptions-AzureSQLDatabases-To-Excel.ps1

The following screenshot is an example of the export:

image

Assuming that each database has different credentials, add the additional columns to store the SQL Authentication username and password:

  • Username
  • Password
image

With the spreadsheet updated, we can now use this PowerShell script to export/back up all of a subscription’s Azure SQL Databases to a storage account container: https://github.com/terenceluk/Azure/blob/main/PowerShell/Backup-AzureSQLDatabases.ps1
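To sketch the shape of that backup loop (this is an illustrative stand-in for the published script, which uses PowerShell; here the spreadsheet is assumed to be saved as CSV, and the resource group, storage URI, and key are placeholders):

```shell
# Sketch: iterate over a CSV with Server,Database,Username,Password columns
# and export each database to a storage container as a .bacpac file.
export_sql_dbs() {
  csv="$1"
  storage_uri_base="$2" # e.g. https://<account>.blob.core.windows.net/<container>
  tail -n +2 "$csv" | while IFS=, read -r server db user pass; do
    cmd="az sql db export -s $server -n $db -g <resource-group> -u $user -p $pass --storage-key-type StorageAccessKey --storage-key \$STORAGE_KEY --storage-uri $storage_uri_base/$db.bacpac"
    if [ "${DRY_RUN:-0}" = "1" ]; then
      echo "$cmd" # dry run: print the command instead of calling the Azure CLI
    else
      eval "$cmd"
    fi
  done
}
```

Setting DRY_RUN=1 lets the loop be sanity-checked without an Azure subscription.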

Hope this helps anyone who might be looking for a way to automate the process of exporting a subscription’s Azure SQL Database to Excel and then using the list to backup the databases.

Let's learn IaC using Terraform with GitHub Actions to deploy into Microsoft Azure


It has been a busy start to the year and I regret that I haven’t been able to dedicate more time to blogging, so I intend on catching up on my backlog in the following months. One of the topics I’ve been meaning to write about is how to use Terraform with GitHub Actions to deploy infrastructure in Azure. Both Terraform (IaC) and GitHub Actions (orchestration engine) are technologies I’ve been self-training on over the past few months, and I am excited to continue building on the knowledge I’ve acquired. Those who may not be familiar with IaC can refer to one of my previous posts here:

Infrastructure as Code in 15 Minutes PowerPoint Presentation
http://terenceluk.blogspot.com/2022/04/infrastructure-as-code-in-15-minutes.html

My journey through learning these two technologies has been challenging but very fulfilling, and the purpose of this post is to share the various features I came across and what I’ve been able to put together to demo all of them. I have to admit that I am not an expert and there may be better approaches, so please feel free to comment on this post.

A few of my colleagues have indicated that they feel it would be beneficial to include more diagrams in my posts rather than just writing, so I have taken the time to create a series of diagrams to better illustrate the workflow and the Terraform code that is used.

What I’m trying to Demo

The components and features I’m trying to demo are the following:

  1. How to use Terraform for IaC
  2. How to execute terraform init, format, validate, plan, apply, destroy in a workflow
  3. How to use different .tfvars variables to deploy different environments: dev, stg, prod
  4. How to use GitHub Actions as an orchestration engine for pipelines
  5. How to initiate a workflow manually, on push, on pull request, on complete of another workflow
  6. How to have the workflow store the Terraform .tfstate file in an Azure Storage Account Container
  7. How to set a dependency on a previous step
  8. How to use and reference custom self-written Terraform modules stored in a different registry and in subfolders (some may not know about the // when accessing a module in a subdirectory)
  9. How to use and reference a Terraform Registry module
  10. How to use the Super-Linter to scan the code
  11. How to get a branch name
  12. How to pass a branch name to a different step
  13. How to store and use secrets in GitHub
  14. How to configure different environments in GitHub
  15. How to configure a protection rule for a GitHub environment
  16. How to require an approval before executing a workflow
  17. How to configure an Azure Storage Account Container to store the .tfstate file
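On item 8, the double slash separates the repository from the subdirectory that holds the module. A hypothetical reference (the subfolder name here is illustrative, not necessarily what the repo uses):

```hcl
module "acr" {
  # Everything before // is the Git repo; everything after is the subdirectory
  source = "github.com/terenceluk/terraform-modules//azure-container-registry"
}
```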

GitHub Repositories

Let me begin by providing the links to the GitHub repositories I will be using for the demonstration.

GitHub Repository that contains the GitHub Actions workflows, Terraform code for deploying dev, stg, and production environments:

https://github.com/terenceluk/terraform-k8s-acr-psql-vms-demo

GitHub Repository that contains the Terraform modules that are referenced and used for the deployment of Azure Kubernetes Service, Azure Container Registry, and PostgreSQL server and database:

https://github.com/terenceluk/terraform-modules

I’ve added as many comments as possible to the code in hopes that whoever reads it will understand its function. Feel free to fork the repositories and test or modify them as you see fit.

GitHub Repository Branches

There will be 3 branches in the GitHub repo:

  • Dev
  • Stg
  • Prod
image

The Terraform code and workflows will be directly pushed to the dev branch to test, then merged into stg and production.

Terraform and GitHub Actions Code

The GitHub Actions YAML files will be stored in the mandatory .github/workflows directory of the repository.

The Terraform code will be split as follows.

terraform-k8s-acr-psql-vms-demo repository

  • The main.tf, output.tf, provider.tf, and variables.tf files are stored in the root
  • The .tfvars files containing the variable values for dev, stg, and prod environments are stored in the subfolder Variables
  • The main.tf references modules that are stored outside of its repository:
    • Another GitHub public repository named terraform-modules
    • A Terraform Registry module

terraform-modules repository

  • This repository contains 3 modules that are used to deploy:
    • Azure Container Registry
    • Azure Kubernetes Service
    • PostgreSQL Server and Database
image

What we are deploying with Terraform

The resources that will be deployed are the following:

  1. Azure Container Registry
  2. Azure Kubernetes Service
  3. PostgreSQL Server and Database
  4. Linux Virtual Machine
  5. Windows Virtual Machine
  6. VNet with subnets
  7. Management lock for the Azure Container Registry

Workflow: terraform-plan.yml and terraform-apply-dev.yml

The terraform-plan.yml and terraform-apply-dev.yml workflows are dispatched whenever there is a push to the dev branch in the GitHub repo. I have included a diagram below that walks through the process and will also list the flow here:

  1. User updates Terraform or GitHub Action YAML code and pushes it to the dev branch of the GitHub repo
  2. The terraform-plan.yml workflow is started as it is configured to start on push to dev
  3. Two steps are now executed in parallel:
    1. Get-Branch-Name to determine what branch this was pushed on
    2. The download and use of the Super-Linter is run in parallel to scan the code
  4. Several Terraform commands are executed:
    1. fmt is run on the Terraform code to ensure it is formatted properly
    2. validate is run on the Terraform code
    3. init is run to initialize Terraform and configure the Azure Storage Account Container that stores the .tfstate file
    4. plan is run to generate a plan with the appropriate terraform-dev.tfvars file
  5. Once the terraform-plan.yml workflow is complete, initiate the terraform-apply-dev.yml workflow
  6. Get-Branch-Name will start to obtain the previous workflow run conclusion
  7. If the previous workflow was not successful, end the workflow; if it was successful, get the branch path that is currently being worked on
  8. If the branch path is not dev, end the workflow; if it is the dev branch, set the environment to the GitHub dev-deploy environment and go to the next step
  9. The dev-deploy GitHub environment has a protection rule configured that requires review and approval so the reviewer will receive an email to approve or reject
  10. If the reviewer has approved then proceed with the deploy where the following are executed:
    1. fmt is run on the Terraform code to ensure it is formatted properly
    2. validate is run on the Terraform code
    3. init is run to initialize Terraform and configure the Azure Storage Account Container that stores the .tfstate file
    4. plan is run to generate a plan with the appropriate terraform-dev.tfvars file, and the -out switch is used to create the plan.tfdata file
    5. apply with -auto-approve is executed using the plan.tfdata file
  11. The resources will now be deployed to Azure
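Condensed into YAML, the plan workflow is shaped roughly like this (an illustrative sketch with assumed action versions, not the exact file in the repo; init is shown before validate since validate needs the providers initialized):

```yaml
name: terraform-plan
on:
  push:
    branches: [dev]
jobs:
  plan:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: hashicorp/setup-terraform@v2
      - run: terraform fmt -check
      - run: terraform init   # backend config points at the Azure Storage container
      - run: terraform validate
      - run: terraform plan -var-file=Variables/terraform-dev.tfvars
```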
image

The following is a screenshot of the jobs in the workflows and the process during the deployment:

image

What a pending review looks like:

image

The email a reviewer would receive:

image

The review prompt in GitHub:

image

The apply output when deploying infrastructure:

image

A successful deployment (note that the duration of 20h 13m 31s is because I left the review pending for over a day):

image

Workflow: terraform-apply.yml

The terraform-apply.yml workflow is executed upon completing a pull request for stg and prod. It is much less complex so I will simply include the two diagrams to describe the process:
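One way to express "run on completion of a pull request" is a pull_request trigger filtered to merged PRs; a hedged illustration (the var-file path and action versions are assumptions, not necessarily what the repo's YAML uses):

```yaml
on:
  pull_request:
    types: [closed]
    branches: [stg, prod]
jobs:
  apply:
    # Run only when the PR was actually merged, not merely closed
    if: github.event.pull_request.merged == true
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: hashicorp/setup-terraform@v2
      - run: terraform init
      # github.base_ref is the PR's target branch (stg or prod)
      - run: terraform apply -auto-approve -var-file=Variables/terraform-${{ github.base_ref }}.tfvars
```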

Staging:

image

Production:

image

Workflow: terraform-destroy.yml

The terraform-destroy.yml workflow is dispatched manually when we want to remove the environment. The following are a few screenshots of manually dispatching the workflow:

image

The output during a destroy of the infrastructure:

image

Successfully destroying the infrastructure:

image

Setting up Azure Storage Account Container and Resource Group

With the walkthrough of the Terraform and GitHub Actions completed, I would like to provide the steps required to set up the Azure Storage Account Container that will be used to store the terraform .tfstate file as none of this would work without it.

We’ll be using the Azure CLI to configure this:

# Log into Azure

az login

image

image

# Define variables for the subscription ID, resource group, Azure region, storage account, container and service principal name

subscriptionID="xxxxxxxx-71c2-40f2-b3d4-xxxxxxxxxx"

resourceGroup="ca-cn-dev-demo-rg"

azureRegion="canadacentral"

storageAccount="cacndevdemost"

containerName="terraform-state"

servicePrincipalName="<service-principal-name>" # placeholder; referenced by the create-for-rbac command below

image

# List available subscriptions

az account list

image

# Specify the subscription to use

az account set -s $subscriptionID

# Create an App Registration and corresponding Enterprise Application / Service Principal and assign it the Contributor role on the subscription – Ref: https://docs.microsoft.com/en-us/cli/azure/ad/sp?view=azure-cli-latest

az ad sp create-for-rbac --name $servicePrincipalName --role Contributor --scopes /subscriptions/$subscriptionID --sdk-auth

Copy the clientId, clientSecret, tenantId values.

image

Note that the following App Registration will be configured along with a secret in Azure AD:

image

The corresponding Enterprise application (Service Principal) will be created:

image

We’ll need to grant the Service Principal permissions on the subscription that Terraform will deploy resources to. Contributor is typically sufficient, but there are some configurations, such as Resource Locks, that require Owner:

image

# Create the resource group that will hold the storage account for saving the Terraform state

az group create -g $resourceGroup -l $azureRegion

image

# Create a new storage account and place it in the resource group

az storage account create -n $storageAccount -g $resourceGroup -l $azureRegion --sku Standard_LRS

image

The following storage account will be created:

image

# Create a container in the storage account to store the terraform state

az storage container create -n $containerName --account-name $storageAccount

image

The following container will be created, and once used, the .tfstate file will be stored here:
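Terraform is pointed at this container through an azurerm backend block that is read during terraform init; a sketch using the variable values from above (the key, i.e. the blob name, is an assumption):

```hcl
terraform {
  backend "azurerm" {
    resource_group_name  = "ca-cn-dev-demo-rg"
    storage_account_name = "cacndevdemost"
    container_name       = "terraform-state"
    key                  = "dev.terraform.tfstate" # blob that will hold the state
  }
}
```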

image

Setting up GitHub Secrets

Various parameters such as service principal attributes, secrets, and storage account access keys should not be stored directly in the Terraform .tfvars files; they should instead be stored in the GitHub secrets vault for retrieval.

Proceed to navigate to the previously configured Storage Account’s Access Keys and copy the key1 as we’ll need to configure it in GitHub secrets:

image

For the purpose of this example, the dev environment will require the following secrets configured as they are referenced in the workflows and terraform code:

  • DEV_ARM_CLIENT_ID
  • DEV_ARM_CLIENT_SECRET
  • DEV_ARM_SUBSCRIPTION_ID
  • DEV_ARM_TENANT_ID
  • DEV_PSQL_ADMINISTRATOR_LOGIN_PASSWORD
  • DEV_PSQL_ADMIN_LOGIN
  • DEV_STORAGE_ACCESS_KEY
  • DEV_STORAGE_ACCOUNT_NAME
  • DEV_STORAGE_CONTAINER_NAME

Note that you will not be able to view the values of these secrets once they are configured in GitHub.

In addition to the DEV secrets, the STG and PROD secrets will also need to be configured for the other branches.
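Within a workflow, these secrets are read through the secrets context and are typically exported as the ARM_* environment variables that the azurerm provider recognizes; an illustrative fragment:

```yaml
env:
  ARM_CLIENT_ID: ${{ secrets.DEV_ARM_CLIENT_ID }}
  ARM_CLIENT_SECRET: ${{ secrets.DEV_ARM_CLIENT_SECRET }}
  ARM_SUBSCRIPTION_ID: ${{ secrets.DEV_ARM_SUBSCRIPTION_ID }}
  ARM_TENANT_ID: ${{ secrets.DEV_ARM_TENANT_ID }}
```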

image

Setting up GitHub Environments

The last requirement for this demo is to set up the different environments in GitHub for the branches. It’s important to note that Environments are NOT available in private repositories on free accounts, so you’ll need to use a public repo. This demo has the following environments configured:

  • dev-deploy
  • prod
  • dev
  • stg

The additional dev-deploy environment is really just a way for me to execute the plan step to verify the code is free of errors and then require a review and approval to initiate the deployment of the resources. This method likely isn’t best practice, but I thought I’d use it to demonstrate how to set the environment in the workflow to force an approval or rejection.

image

With the environments setup, the dev-deploy is configured with the Required reviewers protection rule:

image

… and that’s it. I hope this was beneficial for anyone who may be trying to learn Terraform and GitHub Actions. There are plenty of blog posts available, but I’ve noticed that some were not very clear on the steps, and I’ve spent countless hours troubleshooting the code from start to finish. The process can be very frustrating at times, but it’s also very satisfying when everything starts to work.

I’ll be working on another new project to incorporate an actual application in the future and will be sharing it.

A review of CSP Programs Users, Roles, Groups and how they relate to Azure AD and customers CSP subscriptions


One of the items I had on my to-do list was to create material that I could use to walk my colleagues through how our CSP tenant relates to our customers’ tenants, and one of the examples I wanted to include was how to grant our CSP foreign principal permissions to a customer’s subscription as described in my previous post:

Granting a CSP Foreign Principal the Reader or Owner role onto an Azure Subscription with PowerShell
http://terenceluk.blogspot.com/2021/09/granting-csp-foreign-principal-reader.html

What I quickly noticed while testing the script was that it no longer works today (May 2022) because the DisplayName for the CSP foreign principal provided by the output is now blank. What this means is that my script, which looks for entries where the DisplayName matches Foreign Principal*, will now return zero records:

image

*Note that the warning (https://docs.microsoft.com/en-us/powershell/azure/troubleshooting?view=azps-7.5.0#get-azadgroupmember-doesnt-return-service-principals) is referring to a problem where service principals are not returned by Get-AzAdGroupMember and does not appear to affect the role assignments we’re looking for.

If I remove the filter, it will return the CSP foreign principal with a blank DisplayName as shown in the following screenshot:
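With the DisplayName now blank, one workaround is to filter on the assignment's ObjectType rather than the name. A sketch (I believe foreign principals surface with an ObjectType of ForeignGroup, but verify this in your own tenant; $subscriptionId is a placeholder):

```powershell
# Find CSP foreign principal role assignments without relying on DisplayName
Get-AzRoleAssignment -Scope "/subscriptions/$subscriptionId" |
    Where-Object { $_.ObjectType -eq 'ForeignGroup' } |
    Select-Object DisplayName, ObjectId, RoleDefinitionName
```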

image

For reference, here is a screenshot of the foreign principal displayed as an Azure RBAC role assignment in the Access control (IAM) blade:

image

Here is a screenshot in the Microsoft 365 admin center portal of the foreign principal:

image

I’ve checked three tenants to confirm this is the same across them but since I’m not sure if this is temporary, I will leave the previous post as is with the script unchanged and will provide the updated set of cmdlets in this post that will be focused on discussing the roles from the CSP tenant, how it maps to the tenant’s Azure AD, and how they are used to grant permissions to a customer’s tenant.

Before I begin, the following post provides great information about the CSP identity and rights management even though it is very old: https://docs.microsoft.com/en-us/archive/blogs/hybridcloudbp/identity-and-rights-management-in-csp-model. I highly encourage anyone learning about the CSP program to go through the blog entry.

The Microsoft Partner Center Portal

Those who have worked at a Microsoft partner would be familiar with the partner portal located at: https://partner.microsoft.com where they can sign in by clicking on the PartnerCenter link:

image

Microsoft’s CSP program currently supports three main types of transactional relationships:

  • Indirect providers
  • Indirect resellers
  • Direct-bill partners

More information can be found at the following Microsoft documentation: https://docs.microsoft.com/en-us/partner-center/csp-supported-partner-relationships#types-of-partner-relationships-in-the-csp-program

You’ll be presented with different navigation menus depending on the type of partner relationship and membership (left is Direct-bill while the right is an Indirect reseller):

image

For the purpose of this post, we will focus on Direct-bill partners (Tier 1), who are able to directly provision Azure CSP subscriptions to their customers and are required to use their identity (CSP Provider) to open tickets, because customers are no longer able to do so from within portal.azure.com.

CSP Program Users, Roles, Groups and how they relate to Azure AD

The following is a diagram I mapped out of how the CSP program users and roles relate to the Azure AD:

image

Let’s break down the diagram by mapping the various components out in the Partner Center.

Navigating in the portal https://partner.microsoft.com to User Management, we are able to create accounts and assign predefined roles:

image

These accounts in the Microsoft Partner Center are user accounts in the CSP Azure AD tenant:

image

The roles we are able to assign accounts are listed under the drop-down list Manages your organization’s account as, while the groups we can add the accounts to are listed under Assist your customer as:

image

The Business Profile admin and Manages your organization’s referrals provide these roles:

image

Detailed information about these roles can be found in the following Microsoft documentation:

Azure AD tenant roles and non-Azure AD roles
https://docs.microsoft.com/en-us/partner-center/permissions-overview#azure-ad-tenant-roles-and-non-azure-ad-roles

To summarize, some of these roles and groups are mapped to the Azure AD tenant while the others are not.

Azure AD Tenant Roles

The following is a mapping between the roles listed under: Manages your organization’s account as and the Azure AD tenant roles:

Microsoft Partner Central Role | Azure AD Role
Global admin | Global administrator
Billing admin | Billing administrator
User management admin | User administrator

image

Assigning a user the Global admin role in the Microsoft Partner Center will place this identity, which lives in Azure AD, into the Global administrator role. The same applies for Billing admin > Billing administrator and User management admin > User administrator.

image

Azure AD Tenant Groups

The roles that are provided under Assist your customer as are mapped as these Azure AD groups:

Microsoft Partner Central Role | Azure AD Groups
Admin Agent | AdminAgents
Sales Agent | SalesAgents
Helpdesk Agent | HelpdeskAgents

image

Assigning a user the Admin Agent role in the Microsoft Partner Center will place this identity, which lives in Azure AD, into the AdminAgents Azure AD group. The same applies for Sales Agent > SalesAgents and Helpdesk Agent > HelpdeskAgents.

image

Non-Azure AD Tenant Roles

The remaining list of roles are non-Azure AD tenant roles:

  • business profile admin
  • referral admin
  • incentive admin
  • incentive user
  • MPN (Microsoft Partner Network) partner admin

More information about these non-Azure AD tenant roles: https://docs.microsoft.com/en-us/partner-center/permissions-overview#manage-mpn-membership-and-your-company

image

CSP Admin Agent, Sales Agent and Helpdesk Agent Azure AD Groups to Customer Subscription Azure RBAC Roles Mappings

One of the most important mappings that should be understood is how the CSP Azure AD groups are mapped to the customers’ subscriptions as Foreign Principal Azure RBAC roles. As described earlier, the following three roles assigned within Partner Center:

  1. Admin Agent
  2. Sales Agent
  3. Helpdesk Agent
image

… are mapped to these CSP tenant Azure AD groups:

  1. AdminAgents
  2. SalesAgents
  3. HelpdeskAgents
image

These CSP tenant Azure AD groups can then be granted Azure RBAC roles to the customer’s subscriptions as foreign identities as shown in the screenshot below:

  1. TenantAdmins
  2. SalesAdmins
  3. HelpdeskAdmins
image

Assigning the CSP tenant’s Azure AD Groups to customers’ subscriptions can only be performed through Azure CLI or PowerShell and cannot be performed through the GUI. To demonstrate this process, I will use the scenario for granting Helpdesk Admin role permissions to open tickets.

The diagram at the beginning of this walkthrough outlines how the foreign principal mappings are assigned; note that the foreign principals in the diagram can be granted any Azure RBAC role on the subscription:

image

Granting CSP Provider Accounts Permissions to Open Tickets

Let’s take the scenario where a Microsoft CSP partner wants to set up a group of support representatives who simply open an Azure support ticket with Microsoft when requested by the customer. These representatives do not need elevated permissions on the subscriptions, such as Owner (which is automatically granted to the Admin Agent role when a CSP subscription is created), as they should not have the ability to make any changes to the subscription and its resources. For this scenario, we use the Helpdesk Agent role, which is mapped to the HelpdeskAgents Azure AD group, and assign that group the Support Request Contributor Azure RBAC role on the customer’s subscription. The following diagram depicts the assignments and how the identities are mapped:

image

As mentioned earlier, it is not possible to simply sign in as an Owner on the desired customer CSP subscription, navigate to the Access control (IAM) blade, then assign the foreign principal as a Support Request Contributor as shown in the screenshot below, because foreign principals are not displayed in the portal.azure.com GUI:

image

image

The following are instructions on how to assign the HelpdeskAgents Azure AD group in the CSP Azure AD tenant onto a customer’s subscription as an Azure RBAC Support Request Contributor.

Begin by obtaining the object ID of the HelpdeskAgents Azure AD group in the CSP Azure AD tenant by either navigating to the groups blade:

image

Or alternatively, use the PowerShell cmdlet Get-AzADGroup to list the Object ID:

Connect-AzAccount ### Log in with CSP partner credentials
Get-AzADGroup | Select-Object DisplayName,Id

image

With the HelpdeskAgents Azure AD group ObjectId, proceed to use the following PowerShell New-AzRoleAssignment cmdlet to assign the Azure RBAC role to the desired subscription:

Connect-AzAccount ### Log in with customer global admin credentials

image
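The New-AzRoleAssignment call shown in the screenshot above takes the group’s object ID, the role name, and the subscription scope. The following is a minimal sketch with placeholder values — the IDs below are hypothetical and must be replaced with your own:

```powershell
# Sign in with customer credentials first (Connect-AzAccount), then assign the
# CSP tenant's HelpdeskAgents group the Support Request Contributor role.
# All IDs below are placeholders.
New-AzRoleAssignment -ObjectId "<HelpdeskAgents-group-object-id>" `
    -RoleDefinitionName "Support Request Contributor" `
    -Scope "/subscriptions/<customer-subscription-id>"
```

Because the group lives in the CSP tenant, expect the portal to display it as a foreign principal rather than a resolvable directory object.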

With the above cmdlet executed, the HelpdeskAdmin foreign principal identity will be displayed as a Support Request Contributor on the subscription, and users in this group will be able to open support tickets:

image

Note that you do not need to grant additional permissions such as Reader to the foreign principal if the users only need to open tickets. It is a common misconception that the users need Reader access to the subscription because, prior to granting the Support Request Contributor role, trying to open a new service request will display the following:

image

Restricted Tenant

You do not have access to any subscriptions or resources in this tenant. Click ‘I acknowledge’ to continue or ‘Sign out’ to sign out of this tenant.

image

It is not necessary to grant both Reader and Support Request Contributor on the subscription.

Granting CSP Provider Accounts Permissions to Read Subscription

Another potential scenario is if the CSP partner would like to provide support where representatives will only view resources and provide guidance for troubleshooting without making any changes. Assuming we want to use the HelpdeskAgents Azure AD group, we can use the same cmdlet as shown above to assign the Reader Azure RBAC role to the subscription. Providing this permission will allow the users in this group to view all the resources in the subscription (there are some restrictions, such as various configuration parameters in App Services) but not make any changes such as provisioning, editing, or deleting.

image
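As with the ticket-only scenario, the assignment itself is a single cmdlet and only the role name changes — again a sketch with placeholder IDs rather than a definitive command:

```powershell
# Grant the CSP tenant's HelpdeskAgents group read-only access to the customer subscription
New-AzRoleAssignment -ObjectId "<HelpdeskAgents-group-object-id>" `
    -RoleDefinitionName "Reader" `
    -Scope "/subscriptions/<customer-subscription-id>"
```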

It is also worth noting that having Reader access to a subscription does not permit the user to open support tickets as an attempt to do so will display the following message during the ticket creation process:

You don’t have permission to create a support request

To get permission, ask your subscription administrator or owner to assign you ‘Support Request Contributor’ role for the selected subscription.

image

----------------------------------------------------------------------------------------------------------------------------

I hope this post helps anyone who may be trying to learn more about how a CSP can manage their customers’ tenants. The design isn’t overly complex but requires a bit of time to dissect the components and understand how they all interact with each other.

Behavior for Teams for users who are either disabled or deleted in the on-premise Active Directory synced to Azure AD


I recently had a customer ask me what would happen to a Microsoft Teams Team if the owner, an on-premise AD account synced into Azure AD, was disabled or deleted. As I did not know off the top of my head, I went ahead and tested the scenarios. The following are the results in case anyone is looking for this information.

On-Premise Active Directory Disabled User

  • Teams channels where the disabled user is the only owner and/or member will not be deleted
  • The user will still be listed as owner of Public and Private Teams
  • Disabled status will cause the account to not be displayed when browsing in Manage users
  • Unable to log into Teams with message:

Your account has been locked. Contact your support person to unlock it, then try again.

  • Re-enabling the account will return everything back to normal

On-Premise Active Directory Deleted User

  • Teams channels where the deleted user is the only owner will be listed but displayed with an error:

We can’t retrieve information on this team. Refresh the page.

If you continue to have problems, contact Microsoft customer support.

image

  • Teams channels that have other owners or members will continue to be accessible and if there are only members, they can be promoted to be an owner
  • Channel is not deleted
  • User is removed and no longer appears in the Public and Private Teams
  • If the account is restored from the Recycle Bin:
    • The user will be placed back into the Teams with other members
    • The team with only this account will be accessible with the restored user as the owner
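If you want to identify the teams that would be affected before disabling or deleting an account, a quick check with the MicrosoftTeams PowerShell module can list the teams where a given user is the only owner. This is a sketch — the UPN is hypothetical, and it enumerates every team, so it can be slow in large tenants:

```powershell
# Requires the MicrosoftTeams module and an authenticated session
Connect-MicrosoftTeams

$upn = "jsmith@contoso.com" # hypothetical user being disabled/deleted
Get-Team | Where-Object {
    # Keep only teams whose sole owner is the user in question
    $owners = Get-TeamUser -GroupId $_.GroupId -Role Owner
    $owners.Count -eq 1 -and $owners.User -eq $upn
} | Select-Object DisplayName, GroupId
```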

Microsoft Teams configuration automation using MS Teams Channel, Forms, Logic App and Automation Account with Runbook


One of the past projects I worked on for a client was to help them move from a Mitel PBX phone system to Microsoft Teams as their phone system. Aside from the design and implementation of the routing during the transition and cutover, a key pain point for their organization was allowing their help desk to easily configure Microsoft Teams users for Enterprise Voice. The Microsoft Teams PowerShell cmdlets required to configure a user can be easily copied and pasted, as in the following sample:

Connect-MicrosoftTeams

$dialPlan = "Toronto"

$voiceRoutingPolicy = "Toronto"

$usernameUPN = "tluk@contoso.com"

$extension = "tel:+141655555390;ext=390"

Set-CsUser -Identity $usernameUPN -EnterpriseVoiceEnabled $true -HostedVoiceMail $true -LineURI $extension

Grant-CsTenantDialPlan -PolicyName $dialPlan -Identity (Get-CsOnlineUser $usernameUPN).SipAddress

Grant-CsOnlineVoiceRoutingPolicy -Identity $usernameUPN -PolicyName $voiceRoutingPolicy

… but this wasn’t very friendly for the helpdesk team. What I ended up presenting to the client was to leverage the existing Microsoft 365 and Azure services they already had to provide a form for the helpdesk team to fill in that would then configure the accounts. The components required for this automation are the following:

  • MS Teams Channel
  • Forms
  • Logic App
  • Automation Account

The organization already used Teams channels heavily, which meant no training was required for the team when we created a Form displayed as an additional tab in a Microsoft Teams Team. Once the form is filled out and submitted, it sends the fields to a Logic App, which then executes a Runbook in an Automation Account to configure the desired user. We started off with a simple form, then added an approval notification, and finally a customized web page. I did not get to document the settings at the time, but I would like to replicate part of the configuration in this post to demonstrate the process.

This post will cover the following:

  1. Creating a form displayed as an additional tab in a Microsoft Teams team
  2. Creating a Logic App that will be triggered when a new response is submitted from Microsoft Forms
  3. The Logic App will retrieve the two fields: email address and DID, then use a runbook in an Automation Account to configure the user

Before I begin: this post was written on June 5, 2022, when the Connect-MicrosoftTeams cmdlet had the -ApplicationId switch for defining a service principal with AccessTokens authentication removed. I spent a full day going through pages of forum and blog posts only to realize that none of the modules up to the current 4.4.1 work, as Microsoft is still in the process of fixing it. Having exhausted all avenues including Graph (it does not provide the functionality to configure anything outside of Teams at the moment), I decided to use a regular user account with the required permissions and 2FA disabled. This is absolutely not best practice, and I always discourage the use of anything other than Service Principals or Managed Identities for automation, but there does not appear to be an alternative method.

With that, let’s get started!

Creating an embedded Microsoft Form in a Team

Begin by creating a Microsoft Teams Team that will host the Microsoft Form and click on the + sign in the tabs:

image

Click on the Forms icon:

image

Type in a name for the new form and click on the Save button:

image

The new form should now be displayed and can be edited if you click on the fields:

image

With the form created, proceed to close the form in editing mode by clicking on the Edit | <Form name> tab and clicking the Remove option:

image

The message can be a bit misleading but proceed to remove it as we’ll be re-adding it back in:

image

With the form removed, proceed to click on the + sign in the tab again:

image

Click on the form icon again:

image

We will now use the Add an existing form option to add the form we had just created:

image

The form will now be in Fill mode (non editing):

image

Obtain the Form ID

We will need the form ID for the Logic App to reference so navigate to www.office.com, click on the top left corner tile button and then Forms:

image

The form we most recently created will be displayed, but if the recent list contains other documents that have pushed the form off the list, we can navigate to it by clicking on the team under My Groups:

image

Proceed to open the form by clicking on it:

image

With the form displayed in the browser, look at the URL, locate the value after id=, and copy it:

https://forms.office.com/pages/designpagev2.aspx?origin=OfficeDotCom&lang=en-US&route=GroupForms&subpage=design&id=C0f0hB4_iUiflasPUUMCT_1a5E1qzx9IgJMuQa0Fbb9UME5RSlhYWFNLODdDQ1lVUzhINlRVRzBNNiQlQCN0PWcu

image
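If you prefer not to eyeball the URL, the id= value can be pulled out with a quick regex — this is plain PowerShell string handling and needs no access to Forms:

```powershell
# Extract the value of the id= query parameter from a copied Forms URL
$url = "https://forms.office.com/pages/designpagev2.aspx?origin=OfficeDotCom&lang=en-US&route=GroupForms&subpage=design&id=C0f0hB4_iUiflasPUUMCT_1a5E1qzx9IgJMuQa0Fbb9UME5RSlhYWFNLODdDQ1lVUzhINlRVRzBNNiQlQCN0PWcu"
if ($url -match '[?&]id=([^&]+)') {
    $formId = $Matches[1]
}
$formId # the form ID used by the Logic App trigger
```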

Create the Logic App

Now that we have the form to collect the user input created, proceed to create a new Logic App:

image

image

Fill in the appropriate fields for the new Logic App:

image

Create the Logic App:

image

With the Logic App created, proceed to navigate to the Logic app designer:

image

Scroll down to the Templates section and create a Blank Logic App:

image

A new blank template will be displayed for steps to be configured:

image

Type in Microsoft Forms in the search field and click on When a new response is submitted:

image

Sign into the tenant to create a connection to Microsoft Forms:

image

Select Enter custom Value for the Form Id:

image

Paste in the form ID we copied previously:

C0f0hB4_iUiflasPUUMCT_1a5E1qzx9IgJMuQa0Fbb9UME5RSlhYWFNLODdDQ1lVUzhINlRVRzBNNiQlQCN0PWcu

image

Proceed to create a new step, type in Get response details in the search field and select the action:

image

image

Paste in the previously copied form Id into the Form Id field and select List of response notifications Response ID for the Response ID field:

image

image

Proceed to save the Logic App:

image

Create Service Principal

I, unfortunately, could not get the Connect-MicrosoftTeams module to use a service principal for authentication but will include the instructions and update this post when it is fixed.

Navigate to App Registrations> New Registration:

image

Provide a name for the App Registration and click on Register:

image

Create a new client secret and document the following fields for the App Registration:

  • Application (client) ID
  • Object ID
  • Directory (tenant) ID
  • Secret ID
  • Secret

image

image

I won’t include the permissions that are required as I’m not sure if the documentation will change when the module is fixed.

Create the Automation Account and Runbook

image

image

image

The automation account will need to store the credentials of the service principal that will run the Microsoft Teams cmdlets. Navigate to the Credentials blade and click on Add a credential:

image

Configure the App Registration / Service Principal credential that the PowerShell script will use to authenticate against Azure AD.

**Note that for the purpose of this example, I actually put in a regular user account without 2FA as service principal authentication currently does not work with the Connect-MicrosoftTeams module:

image

The following screenshot shows the service principal credentials configured:

image

We’ll need to reference the tenant ID once Service Principal authentication works with the Connect-MicrosoftTeams module, so I will include the steps to create the variable that will be called within the PowerShell runbook:

image

image

image

The Automation Account will not have the MicrosoftTeams module available by default so click on Browse gallery:

image

Search for MicrosoftTeams and install it:

image

image

image

image

image

With the credentials, variable and module configured, we can now navigate to Runbooks to create a new runbook:

image

Select PowerShell for Runbook type and 5.1 for Runtime version:

image

In the PowerShell Runbook paste the following code:

<#
    .DESCRIPTION
        Obtain email address and DID and enable user for Teams enterprise voice
    .NOTES
        AUTHOR: Terence Luk
        LASTEDIT: June 4, 2022
#>

Param
(
  [Parameter (Mandatory = $true)]
  [String] $EmailAddress,
  [Parameter (Mandatory = $true)]
  [String] $DID
)

# Retrieve the stored credential that will run the Microsoft Teams cmdlets
$spCredential = Get-AutomationPSCredential -Name 'Teams Administrator Account'

# Retrieve the Azure AD tenant ID (reserved for when service principal authentication works)
$tenantID = Get-AutomationVariable -Name 'Tenant ID'

Connect-MicrosoftTeams -Credential $spCredential

# Declare variables for dial plan and voice routing policy (these could also be passed by the form)
$dialPlan = "Toronto"
$voiceRoutingPolicy = "Toronto"

# Convert DID to E.164 format with the ;ext= extension suffix
$extension = $DID.SubString($DID.Length - 3, 3)
$e164 = "tel:+1" + $DID + ";ext=" + $extension
Write-Host "Teams DID value:" $e164

Set-CsUser -Identity $EmailAddress -EnterpriseVoiceEnabled $true -HostedVoiceMail $true -LineURI $e164
Grant-CsTenantDialPlan -PolicyName $dialPlan -Identity (Get-CsOnlineUser $EmailAddress).SipAddress
Grant-CsOnlineVoiceRoutingPolicy -Identity $EmailAddress -PolicyName $voiceRoutingPolicy
Get-CsOnlineUser -Identity $EmailAddress | FL *uri*,*voice*,*dial*
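The DID string handling above can be sanity-checked without any Teams connection. Note the script assumes the extension is always the last three digits of a 10-digit DID:

```powershell
# Same conversion logic as the runbook, run against a sample DID
$DID = "4165550296"
$extension = $DID.SubString($DID.Length - 3, 3) # last 3 digits -> "296"
$e164 = "tel:+1" + $DID + ";ext=" + $extension
$e164 # tel:+14165550296;ext=296
```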

image

image

Proceed to publish the runbook:

image

You can test the PowerShell runbook by clicking Start; it will prompt you for the email address and DID values:

image

image

Complete configuring Logic App

With the Automation Account and Runbook configured, navigate back to the Logic App to configure it to execute the Runbook when a form is submitted. Proceed by adding an additional step, search for Create Job, and click on Create job under Actions:

image

Fill in the values for the Create job step and ensure you map the Runbook Parameter EmailAddress and Runbook Parameter DID to the form submission fields:

image

… and that should be it. A user using the form in the Microsoft Teams Team will now be able to configure a licensed user for Enterprise Voice. This automation can be expanded to assign the appropriate license, other enterprise voice configuration such as dial plan and voice routing policy (these are hard-coded in the example), approvals, email notifications, and much more.

Form Demo

imageimage

Hope this helps anyone who might be looking for a demonstration of this!

Replacing Set-CsUser with Set-CsPhoneNumberAssignment for configuring Microsoft Teams users' voice settings


Teams administrators who have been using Set-CsUser (https://docs.microsoft.com/en-us/powershell/module/skype/set-csuser?view=skype-ps) to configure Microsoft Teams users’ voice settings will notice that it has stopped working, as Microsoft indicated early in 2022 that it would be deprecated. The replacement is Set-CsPhoneNumberAssignment (https://docs.microsoft.com/en-us/powershell/module/teams/set-csphonenumberassignment?view=teams-ps), which has a few changes I would like to quickly highlight.

This first example is the set of Set-CsUser cmdlets I use for a client who has Teams with Direct Routing:

$usernameUPN = "jsmith@contoso.com"

$extension = "tel:+14165550296;ext=296"

Set-CsUser -Identity $usernameUPN -EnterpriseVoiceEnabled $true -HostedVoiceMail $true -LineURI $extension

Grant-CsTenantDialPlan -PolicyName Toronto -Identity (Get-CsOnlineUser $usernameUPN).SipAddress

Grant-CsOnlineVoiceRoutingPolicy -Identity $usernameUPN -PolicyName "Toronto"

Get-CsOnlineUser -Identity $usernameUPN | FL *uri*,*voice*,*dial*

The following is the updated cmdlet that uses Set-CsPhoneNumberAssignment:

$usernameUPN = "jsmith@contoso.com"

$extension = "+14165550296;ext=296"

Set-CsPhoneNumberAssignment -Identity $usernameUPN -PhoneNumber $extension -PhoneNumberType DirectRouting

Grant-CsTenantDialPlan -PolicyName Toronto -Identity (Get-CsOnlineUser $usernameUPN).SipAddress

Grant-CsOnlineVoiceRoutingPolicy -Identity $usernameUPN -PolicyName "Toronto"

Get-CsOnlineUser -Identity $usernameUPN | FL *uri*,*voice*,*dial*

The difference here is that there is no need to set EnterpriseVoiceEnabled to true, because when you assign a phone number the EnterpriseVoiceEnabled flag is automatically set to True. The provided number also no longer accepts the tel: prefix.

Hope this provides a quick answer to anyone who may have realized their Set-CsUser cmdlets no longer work.

I will also be updating my Teams import script from this previous post:

PowerShell script for exporting Microsoft Teams user configuration to an Excel and importing user configuration with updated Excel file
http://terenceluk.blogspot.com/2022/05/powershell-script-for-exporting.html


Using Microsoft Graph PowerShell SDK to retrieve Office 365 / Microsoft 365 license usage with friendly names


Those who have been using the MSOnline and Azure AD modules may be aware that Microsoft has announced they will be deprecated after December 2022, which means all scripts that rely on these modules will need to be updated. This meant one of the scripts I have often used to automate reports for Office 365 / Microsoft 365 licensing was no longer going to work in 5 months, so I thought there was no better time than now to modernize it. The two options available were:

  1. Microsoft Graph PowerShell SDK
  2. Microsoft Graph API

Given that what I had was already a PowerShell script, I decided to migrate it to option #1 – Microsoft Graph PowerShell SDK.

The task seemed pretty straightforward, as Get-MgSubscribedSku (https://docs.microsoft.com/en-us/powershell/module/microsoft.graph.identity.directorymanagement/get-mgsubscribedsku?view=graph-powershell-1.0) appeared to be exactly what I needed, but I later realized that the output did not provide the number of licenses purchased because it is stored in the PrepaidUnits property, which contains:

  1. TotalUnits
  2. SuspendedUnits
  3. WarningUnits

Furthermore, the provided SkuPartNumber name wasn’t the friendly name we would see on the portal so simply sending these fields to, say, sales or accounting would likely confuse them.

The script I ended up with retrieves a table with the fields provided by Get-MgSubscribedSku, another table containing the PrepaidUnits property, and finally a table imported from a CSV of licenses and their friendly names provided by Microsoft here: https://docs.microsoft.com/en-us/azure/active-directory/enterprise-users/licensing-service-plan-reference
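That join can be sketched roughly as follows. Note the CSV file name and its column headings (String_Id, Product_Display_Name) are my reading of the downloadable file from the licensing reference page, so verify them against the copy you download:

```powershell
# Requires Microsoft.Graph.Identity.DirectoryManagement and an authenticated Connect-MgGraph session
$skus = Get-MgSubscribedSku
$friendlyNames = Import-Csv ".\Product names and service plan identifiers for licensing.csv"

foreach ($sku in $skus) {
    # Look up the portal-style display name; fall back to the raw SKU part number
    $name = ($friendlyNames | Where-Object String_Id -eq $sku.SkuPartNumber |
        Select-Object -First 1).Product_Display_Name
    if (-not $name) { $name = $sku.SkuPartNumber }
    [PSCustomObject]@{
        License   = $name
        Purchased = $sku.PrepaidUnits.Enabled # total purchased units
        Consumed  = $sku.ConsumedUnits
    }
}
```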

The script can be found here at my GitHub repo: https://github.com/terenceluk/Microsoft-365/blob/main/Administration/Get-M365-License-Report.ps1

… and here are some sample outputs:

HTML

Excel

 

How to determine the master image of a Machine Catalog in Citrix Virtual Apps and Desktops / Citrix DaaS


Some of my ex-colleagues would ask me about Citrix Virtual Apps and Desktops from time to time, and one of the most common questions is how to determine the master image of a Machine Catalog in Citrix Virtual Apps and Desktops / DaaS, because there still does not seem to be a way to find this information from the GUI. Those familiar with the Citrix portal will be aware of the Template Properties tab shown here:

image

… and while it displays what snapshot the master image virtual machine for this machine catalog is currently using, it does not indicate the VM name. This appears to have been by design since the Citrix XenDesktop 5.6 days, as shown in one of my older posts here:

How do I find what master image I used for a desktop catalog in Citrix XenDesktop 5.6?
http://terenceluk.blogspot.com/2012/04/how-do-i-find-what-master-image-i-used.html

It was easy to load the PowerShell snap-ins directly on a Delivery Controller back in the on-premise days, but most of the environments I’ve worked in over the past few years have been in Citrix Cloud, which means you’ll need to install the Citrix Remote PowerShell SDK to connect to Citrix Cloud remotely.

Given that I haven’t really written a post on this and I often struggle to remember, this serves as a short write up that I can refer to in the future.

The Virtual Apps and Desktops Remote PowerShell SDK can be downloaded here:

Virtual Apps and Desktops Remote PowerShell SDK
https://www.citrix.com/downloads/citrix-cloud/product-software/xenapp-and-xendesktop-service.html

image

Once installed, launch PowerShell and execute the following command to add the Citrix PowerShell snap-ins:

asnp Citrix*

Use the following cmdlet to authenticate against Citrix Cloud (a browser popup with a prompt similar to https://citrix.cloud.com will be displayed):

Get-XDAuthentication

After successfully authenticating, you can use the Get-ProvScheme cmdlet to list all of the machine catalogs for the tenant, or narrow it down by using the ProvisioningSchemeName parameter to reference the machine catalog you want the details for:

Get-ProvScheme -ProvisioningSchemeName "Machine catalog name"

The screenshot below is a sample output; the field we’re interested in is MasterImageVM:

MasterImageVM: XDHyp:\HostingUnits\CC Nimble\CTX-CMComApp-CC.vm\CTX-CMComApp-CC Post Vendor Upgrade 07-11-19.snapshot\Post Vendor Upgrade 08-04-20.snapshot\Updated July 28 2021.snapshot\CTX-CMComApp-CC_vm-3432_1.snapshot\Vendor Update July 4 2022.snapshot

The value ending with .vm represents the virtual machine name and in this example the VM is named:

CTX-CMComApp-CC

image
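Extracting the VM name programmatically is a simple string match against the hosting path — a sketch assuming the MasterImageVM format shown above, where the VM name is the path segment ending in .vm:

```powershell
# Sample value from the output above; in practice use (Get-ProvScheme ...).MasterImageVM
$masterImageVM = 'XDHyp:\HostingUnits\CC Nimble\CTX-CMComApp-CC.vm\Vendor Update July 4 2022.snapshot'
if ($masterImageVM -match '\\([^\\]+)\.vm\\') {
    $vmName = $Matches[1]
}
$vmName # CTX-CMComApp-CC
```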

If there are multiple vCenters in the environment, the vCenter hosting the VM can be found via the GUI by navigating into the configured resource:

imageimage

Hope this helps anyone looking for this information.

Create an automated report for Office 365 / Microsoft 365 license usage with friendly names using an Azure Function App and Logic Apps


I received quite a few requests about how to automate the process from my previous post, where I described how to use the Microsoft Graph PowerShell SDK to generate an Office 365 / Microsoft 365 license report with friendly names:

Using Microsoft Graph PowerShell SDK to retrieve Office 365 / Microsoft 365 license usage with friendly names
http://terenceluk.blogspot.com/2022/07/using-microsoft-graph-powershell-sdk-to.html

I have to be honest that I hadn’t actually converted my old MSOnline-based reports, so I took the time over the weekend to create a new one and will now demonstrate the process.

Step #1 - Creating a Service Principal for App-Only authentication for Microsoft Graph PowerShell SDK (Connect-MgGraph)

The first step is to set up an App Registration / Service Principal so the Azure Function App can authenticate and sign into Microsoft Graph. The following documentation describes the process of registering the application:

Use app-only authentication with the Microsoft Graph PowerShell SDK
https://docs.microsoft.com/en-us/powershell/microsoftgraph/app-only?view=graph-powershell-1.0&tabs=azure-portal

… but does not include the process of creating a certificate, so I will include a few other posts I’ve written which demonstrate how to create a self-signed certificate (both the .cer and .pfx) for application authentication:

Step #3 – Generate a self-signed certificate for the application that will be authenticating
http://terenceluk.blogspot.com/2021/05/setting-up-app-only-authentication-for.html

Step #2 – Create a self-signed certificate on the local Windows desktop and export it to PFX with the private key
http://terenceluk.blogspot.com/2022/02/creating-service-principal-to-connect.html

The following are screenshots of the process:

Create a new App Registration as such:

image

Copy the following values once the App Registration has been created:

  • Application (client) ID
  • Directory (tenant) ID
image

Navigate to the Certificates & secrets blade, click on Upload certificate, select the .cer export of the certificate (this file does not contain the private key), add a description, and upload:

image

With the certificate uploaded, proceed to copy the Thumbprint:

image

The new App Registration will need to be configured with the required permissions by navigating to API permissions, clicking on Add a permission, and selecting Microsoft Graph:

image

Select Application permissions:

image

Add the following Application permissions:

  • Directory.Read.All
  • Directory.ReadWrite.All
  • Organization.Read.All
  • Organization.ReadWrite.All
imageimage

The configured permissions should be listed as shown in the screenshot below. Before proceeding, make sure you click on Grant admin consent for organization, or else the permissions will not come into effect.

**Note that User.Read is a Delegate permission and should already be added by default.

image

The App Registration / Service Principal that will be used to connect to Microsoft Graph should now be in place. You should be able to test authentication on a workstation that has the certificate along with its private key by using the following cmdlet:

Connect-MgGraph -ClientID "xxxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxxxx" -TenantId "xxxxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxxxxxx" -CertificateThumbprint "3548c8xxxxxxxxxxxxxxxxxxxxxb2f8affa214"

image

Step #2 – Create a Storage Account Container that will provide the CSV file containing the product friendly names

Microsoft provides a CSV file that contains the product friendly names from the following document: https://docs.microsoft.com/en-us/azure/active-directory/enterprise-users/licensing-service-plan-reference

Proceed to download the CSV file and place the file into a storage account as such:

image

The function app will retrieve this CSV file and use it to map the friendly names to the licenses.

Step #3 – Create a Function App that will retrieve Office 365 / Microsoft 365 license usage with friendly names and return it in HTML format

With the App Registration / Service Principal in place, the next step is to create a Function App that will retrieve Office 365 / Microsoft 365 license usage with friendly names and return it in HTML format. This Function App collects the data that will in turn be called by a Logic App to generate an email and send the report off to an email address.

image

Proceed to create a Function App with the following parameters:

Publish: Code
Runtime stack: PowerShell Core
Version: 7.2
Operating System: Windows

Configure the rest of the parameters as required by the environment.

image

image

With the Function App created, proceed to configure two new application settings that will contain the previously copied Application (client) ID of the service principal and the thumbprint of the certificate we’ll be using to authenticate to Microsoft Graph. The names I’ll use for these settings are:

  • appID
  • WEBSITE_LOAD_CERTIFICATES
image

appId

image

WEBSITE_LOAD_CERTIFICATES

image

With the application settings configured, proceed to upload the certificate file containing the private key (.pfx) that the Function App will use to authenticate and sign into Microsoft Graph:

image

Confirm that the certificate has successfully uploaded and is in a healthy state:

image

Next, proceed to configure the requirements.psd1 file in the App files blade so the appropriate modules are loaded for the PowerShell code in the function app. Note that I chose to import the specific modules required for the cmdlets Connect-MgGraph (Microsoft.Graph.Authentication) and Get-MgSubscribedSku (Microsoft.Graph.Identity.DirectoryManagement) because the Microsoft.Graph module has 38 sub-modules in it and I was not able to get the function app code to run by importing the full module.

@{
    'JoinModule' = '3.*'
    'Microsoft.Graph.Identity.DirectoryManagement' = '1.*'
    'Microsoft.Graph.Authentication' = '1.*'
}

image

Application Insights is extremely useful for troubleshooting issues with the Function App so I would highly recommend turning it on:

image

With the prerequisites configured for the Function App, proceed to create the actual function trigger:

image

Select HTTP trigger as the template and provide a meaningful name:

image

With the trigger created, navigate to Code + Test and paste the following code into run.ps1:

https://github.com/terenceluk/Microsoft-365/blob/main/Administration/Get-M365-License-Report-Function.ps1

image

The following are changes you’ll need to apply to the code:

The client name:

image

The storage account URI:

image

With the function app code in place, proceed to use the Test/Run feature to test it. Note that the function app expects the tenant ID to be passed to it so the Body of the test should include the following:

{
    "tenant": "xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
}
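For context, a minimal sketch of how the HTTP trigger can read this value (using the standard parameter names from the PowerShell Functions HTTP trigger template; the report-building logic is elided and the variable names are illustrative):

```powershell
using namespace System.Net
param($Request, $TriggerMetadata)

# Read the tenant ID from the POST body, falling back to the query string
$tenantId = $Request.Query.tenant
if (-not $tenantId) { $tenantId = $Request.Body.tenant }

# ... connect to Microsoft Graph and build the HTML report here ...

# Return the HTML report with a 200 OK response
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body       = $htmlReport
})
```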

image

The following log entries will be displayed if Application Insights is turned on:

image

Confirm the HTTP response code of 200 OK and the HTTP response content results:

image

Step #4 – Create a Logic App that is scheduled and will call the Azure Function App to retrieve the license report and then send it out

With the Azure Function App created and tested, proceed to create the Logic App that will be scheduled, call the Function App for the HTML license report, and then email it out.

image

Navigate to the Logic app designer blade and begin to configure the steps for the Logic App. The following are the steps we’ll be configuring:

image

The first is the Recurrence step that will schedule this logic app to run on the last day of each month:

image

Note that the GUI doesn’t provide the required controls so we’ll be using the JSON code provided in this document for the configuration: https://docs.microsoft.com/en-us/azure/logic-apps/concepts-schedule-automated-recurring-tasks-workflows#run-once-at-last-day-of-the-month

Click on Code View:

image

Then edit the triggers section:

image
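For reference, once the triggers section is edited, the recurrence portion of the trigger definition ends up similar to the snippet below (taken from the Microsoft document linked above; `-1` represents the last day of the month):

```json
"Recurrence": {
    "type": "Recurrence",
    "recurrence": {
        "frequency": "Month",
        "interval": 1,
        "schedule": {
            "monthDays": [-1]
        }
    }
}
```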

Create an additional step by clicking on the + button and select Add an action then type in Function:

image

Select the Function App that was created:

image

Select the trigger that was created:

image

Place the body containing the tenant ID into the Request Body:

{
    "tenant": "xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
}

image

Proceed to create two additional steps:

  1. Initialize variable
  2. Set variable

These two steps will place the retrieved HTML report into the body of the email:

Initialize variable

Name: EmailBody
Type: String
Value: <leave blank>

Set variable

Name: EmailBody
Value: Select the Body

image

Add the last step that will email this report to the email address required:

image

Proceed to use the Run Trigger feature to execute the Logic App and confirm that the report is generated and sent:

image

I hope this helps anyone who may be looking for instructions on how to configure automated reports. The task may seem trivial but it took me a few hours to troubleshoot the issues.

Using Azure Function App and Logic Apps to create an automated report that downloads a CSV, creates a HTML table, sends an email with the HTML table and attaches the CSV


In this post, I would like to demonstrate the following using an Azure Function App and Logic App.

Function App:

  1. Download a CSV file from a URL
  2. Count the number of records in the table
  3. Convert the CSV data into a table in HTML format
  4. Return an HTML-formatted email body for delivery

Logic App:

  1. Set up a recurring Logic App that runs at the end of the month
  2. Execute the Function App to retrieve the HTML-formatted email report
  3. Download the CSV file from a URL
  4. Send an email with the HTML-formatted report and the CSV as the attachment
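As a sketch, the four Function App steps map to PowerShell along these lines (the variable names are illustrative, and the real code is linked in Step #1 below):

```powershell
# Token arrives in the request body; the Cylance report URL embeds it
$token  = $Request.Body.token
$csvUrl = "https://protect-sae1.cylance.com/Reports/ThreatDataReportV1/devices/$token"

# 1. Download the CSV device list
$devices = (Invoke-WebRequest -Uri $csvUrl -UseBasicParsing).Content | ConvertFrom-Csv

# 2. Count the number of records
$deviceCount = ($devices | Measure-Object).Count

# 3. Convert the CSV data into an HTML table
$table = $devices | ConvertTo-Html -Fragment

# 4. Assemble the HTML email body for delivery
$htmlReport = "<h3>Cylance Device Report - $deviceCount devices</h3>" + ($table -join "`n")
```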

I will use a device list downloaded from a Cylance tenant to demonstrate this. For those who are not familiar with Cylance, it is cyber threat detection software for endpoints such as Windows and Mac operating systems. Reports can be retrieved via a URL with a unique token that belongs to an organization’s tenant. Here is what the portal with the URLs looks like:

image

We’ll be using the Devices URL with the unique token to retrieve our report:

https://protect-sae1.cylance.com/Reports/ThreatDataReportV1/devices/4B1FFFxxxxxxxxxxx296

The downloaded report will look as such:

image

Step #1 – Create a Function App that will retrieve Cylance Device List and generate and return a HTML email report

Begin by creating a Function App that will retrieve the Cylance device list and return it in HTML format. This Function App collects the data that will in turn be called by a Logic App to generate an email and send the report off to an email address.

image

Proceed to create a Function App with the following parameters:

Publish: Code
Runtime stack: PowerShell Core
Version: 7.2
Operating System: Windows

Configure the rest of the parameters as required by the environment.

image

image

With the Function App created, proceed to create the function trigger:

image

Select HTTP trigger as the template and provide a meaningful name:

image

With the trigger created, navigate to Code + Test and paste the following code into run.ps1:

https://github.com/terenceluk/Microsoft-365/blob/main/Administration/Get-CylanceDeviceReport.ps1

image

The following are changes you’ll need to apply to the code:

The client name:

image

With the function app code in place, proceed to use the Test/Run feature to test it. Note that the function app expects the Cylance token to be passed to it, so the Body of the test should include the following:

{
    "token": "xxxxxxxxxxxxxxxxxxxxxxxxxx"
}

image

Confirm the HTTP response code of 200 OK and the HTTP response content results:

image

Step #2 – Create a Logic App that is scheduled and will download the device list CSV file, call the Azure Function App to retrieve the device list report and then send it out with the CSV as an attachment

With the Azure Function App created and tested, proceed to create the Logic App that will be scheduled, download the device list CSV file, call the Function App for the HTML device list report and then email it out.

image

Navigate to the Logic app designer blade and begin to configure the steps for the Logic App. The following are the steps we’ll be configuring:

The first is the Recurrence step that will schedule this logic app to run on the last day of each month:

image

Note that the GUI doesn’t provide the required controls, so we’ll use the JSON code provided in this document for the configuration: https://docs.microsoft.com/en-us/azure/logic-apps/concepts-schedule-automated-recurring-tasks-workflows#run-once-at-last-day-of-the-month

Click on Code View:

image

Then edit the triggers section:

"recurrence": {
    "frequency": "Month",
    "interval": 1,
    "schedule": {
        "monthDays": [-1]
    }
}

image

Create an additional step by clicking on the + button and select Add an action then type in Function:

image

Select the Function App that was created:

image

Select the trigger that was created:

image

Place the body containing the token into the Request Body:

{
    "token": "xxxxxxxxxxxxxxxxxxxxxxxxxx"
}

image

Proceed to create two additional steps:

  1. Initialize variable
  2. Set variable

These two steps will place the retrieved HTML report into the body of the email:

Initialize variable

Name: EmailBody
Type: String
Value: <leave blank>

image

image

Set variable

Name: EmailBody
Value: Select the Body

image

image

Continue to create the HTTP step to download the device list so we can attach the CSV file as an attachment:

image

Configure the following parameters:

Method: GET
URI: Place the URL with the token to download the device list

Note that I have hardcoded the token into the URI. It is possible to declare this as a variable and pass it in here as well.

image

Add the last step that will email this report to the email address required:

image

Note that we are placing the EmailBody variable we created from the Function App output into the Body, and attaching the Body from the HTTP step as an attachment. The Attachments Name can be any name you prefer.

image

Proceed to use the Run Trigger feature to execute the Logic App and confirm that the report is generated and sent:

image

I hope this helps anyone who may be looking for instructions on how to configure automated reports.

Azure Function App using certificate authentication fails to authenticate when executing Connect-MgGraph


I was recently contacted by someone to inform me that the Function App I provided in my previous post written on July 25, 2022:

Create an automated report for Office 365 / Microsoft 365 license usage with friendly names using an Azure Function App and Logic Apps
http://terenceluk.blogspot.com/2022/07/create-automated-report-for-office-365.html

… no longer worked so I had a look at the Function App and confirmed that it would fail with the following error message:

2022-09-05T10:29:27Z [Error] ERROR: Could not load file or assembly 'Newtonsoft.Json, Version=13.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed'. The system cannot find the file specified.

Exception :

Type : System.IO.FileNotFoundException

Message : Could not load file or assembly 'Newtonsoft.Json, Version=13.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed'. The system cannot find the file specified.

FileName : Newtonsoft.Json, Version=13.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed

TargetSite :

Name : MoveNext

DeclaringType : Microsoft.Graph.PowerShell.Authentication.Cmdlets.ConnectMgGraph+<ProcessRecordAsync>d__56, Microsoft.Graph.Authentication, Version=1.11.1.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35

MemberType : Method

Module : Microsoft.Graph.Authentication.dll

StackTrace :

at Microsoft.Graph.PowerShell.Authentication.Cmdlets.ConnectMgGraph.ProcessRecordAsync()

Source : Microsoft.Graph.Authentication

HResult : -2147024894

CategoryInfo : NotSpecified: (:) [Connect-MgGraph], FileNotFoundException

FullyQualifiedErrorId : Microsoft.Graph.PowerShell.Authentication.Cmdlets.ConnectMgGraph

InvocationInfo :

MyCommand : Connect-MgGraph

ScriptLineNumber : 85

OffsetInLine : 1

HistoryId : 1

ScriptName : C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1

Line : Connect-MgGraph -ClientID $appId -TenantId $tenantID -CertificateThumbprint $thumb ## Or -CertificateName "M365-License"

PositionMessage : At C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1:85 char:1

+ Connect-MgGraph -ClientID $appId -TenantId $tenantID -CertificateThum …

+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

PSScriptRoot : C:\home\site\wwwroot\HttpTrigger-Get-Licenses

PSCommandPath : C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1

InvocationName : Connect-MgGraph

CommandOrigin : Internal

ScriptStackTrace : at <ScriptBlock>, C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1: line 85

PipelineIterationInfo :

2022-09-05T10:29:36Z [Error] ERROR: Authentication needed, call Connect-MgGraph.

Exception :

Type : System.Security.Authentication.AuthenticationException

TargetSite :

Name : GetGraphHttpClient

DeclaringType : Microsoft.Graph.PowerShell.Authentication.Helpers.HttpHelpers

MemberType : Method

Module : Microsoft.Graph.Authentication.dll

StackTrace :

at Microsoft.Graph.PowerShell.Authentication.Helpers.HttpHelpers.GetGraphHttpClient(InvocationInfo invocationInfo, IAuthContext authContext)

at Microsoft.Graph.PowerShell.Module.BeforeCreatePipeline(InvocationInfo invocationInfo, HttpPipeline& pipeline)

at Microsoft.Graph.PowerShell.Module.CreatePipeline(InvocationInfo invocationInfo, String parameterSetName)

at Microsoft.Graph.PowerShell.Cmdlets.GetMgSubscribedSku_List1.ProcessRecordAsync()

Message : Authentication needed, call Connect-MgGraph.

Source : Microsoft.Graph.Authentication

HResult : -2146233087

CategoryInfo : NotSpecified: (:) [Get-MgSubscribedSku_List1], AuthenticationException

FullyQualifiedErrorId : Microsoft.Graph.PowerShell.Cmdlets.GetMgSubscribedSku_List1

InvocationInfo :

MyCommand : Get-MgSubscribedSku_List1

ScriptLineNumber : 34

OffsetInLine : 1

HistoryId : 1

ScriptName : C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1

Line : $licenseUsage = Get-MgSubscribedSku | Select-Object -Property SkuPartNumber,CapabilityStatus,@{Name="PrepaidUnits";expression={$_.PrepaidUnits.Enabled -join ";"}},ConsumedUnits,SkuId,AppliesTo

PositionMessage : At C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1:34 char:1

+ $licenseUsage = Get-MgSubscribedSku | Select-Object -Property SkuPart …

+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

PSScriptRoot : C:\home\site\wwwroot\HttpTrigger-Get-Licenses

PSCommandPath : C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1

InvocationName : Get-MgSubscribedSku

CommandOrigin : Internal

ScriptStackTrace : at Get-MgSubscribedSku<Process>, C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Identity.DirectoryManagement\1.10.0\exports\v1.0\ProxyCmdletDefinitions.ps1: line 12245

at Get-LicenseUsage, C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1: line 34

at <ScriptBlock>, C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1: line 88

PipelineIterationInfo :

image

Going through the logs revealed the two lines below which suggest that there was something wrong with using the Connect-MgGraph cmdlet to connect to Microsoft Graph:

Microsoft.Graph.Authentication, Version=1.11.1.0

2022-09-05T10:29:36Z [Error] ERROR: Authentication needed, call Connect-MgGraph.

I’ve run into issues like this in the past which resulted in a lot of troubleshooting, so I’m glad I’ve already gone through it before and immediately realized it must be because the Microsoft.Graph.Authentication module was updated and, either intentionally or unintentionally, no longer works with the certificate authentication I’m using. Browsing the PowerShell Gallery for Microsoft.Graph (https://www.powershellgallery.com/packages/Microsoft.Graph/1.11.1) shows the latest version at the time of this writing was 1.11.1, released in late August 2022, while I wrote the original post back in July. This led me to believe that the latest 1.11.1 version is the reason for the error.

Reviewing the requirements.psd1 I had created for the Function App shows that any major version 1.* should be used:

@{
    'JoinModule' = '3.*'
    'Microsoft.Graph.Identity.DirectoryManagement' = '1.*'
    'Microsoft.Graph.Authentication' = '1.*'
}

image

To correct the issue, I reviewed the version of the Microsoft.Graph.Authentication module I had on my local computer (version 1.10.0), tested the script locally to confirm it worked, then updated the requirements.psd1 for the function app to specify a specific version:
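To check which version of the module is installed locally, either of the following can be used:

```powershell
# List every locally available copy of the module with its version
Get-Module -ListAvailable Microsoft.Graph.Authentication |
    Select-Object Name, Version

# Or, if the module was installed from the PowerShell Gallery:
Get-InstalledModule Microsoft.Graph.Authentication | Select-Object Name, Version
```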

@{
    # 'Az' = '6.*'
    'JoinModule' = '3.*'
    'Microsoft.Graph.Identity.DirectoryManagement' = '1.10.0'
    'Microsoft.Graph.Authentication' = '1.10.0'
}

image

Then as per the Microsoft documentation that explains how to target a specific version (https://docs.microsoft.com/en-us/azure/azure-functions/functions-reference-powershell?tabs=portal#dependency-management), navigated into the profile.ps1 file in the App files:

image

… and added the import statements to import the modules:

Import-Module Microsoft.Graph.Identity.DirectoryManagement -RequiredVersion '1.10.0'
Import-Module Microsoft.Graph.Authentication -RequiredVersion '1.10.0'

image

Reference article: https://docs.microsoft.com/en-us/azure/azure-functions/functions-reference-powershell?tabs=portal#target-specific-versions

image

Once the above steps are completed, attempting to execute the function app may still fail to run and display the following error message:

2022-09-05T10:23:37Z [Error] ERROR: Assembly with same name is already loaded

Exception :

Type : System.IO.FileLoadException

Message : Assembly with same name is already loaded

TargetSite :

Name : LoadBinaryModule

DeclaringType : Microsoft.PowerShell.Commands.ModuleCmdletBase

MemberType : Method

Module : System.Management.Automation.dll

StackTrace :

at Microsoft.PowerShell.Commands.ModuleCmdletBase.LoadBinaryModule(PSModuleInfo parentModule, Boolean trySnapInName, String moduleName, String fileName, Assembly assemblyToLoad, String moduleBase, SessionState ss, ImportModuleOptions options, ManifestProcessingFlags manifestProcessingFlags, String prefix, Boolean loadTypes, Boolean loadFormats, Boolean& found, String shortModuleName, Boolean disableFormatUpdates)

at Microsoft.PowerShell.Commands.ModuleCmdletBase.LoadBinaryModule(Boolean trySnapInName, String moduleName, String fileName, Assembly assemblyToLoad, String moduleBase, SessionState ss, ImportModuleOptions options, ManifestProcessingFlags manifestProcessingFlags, String prefix, Boolean loadTypes, Boolean loadFormats, Boolean& found)

at Microsoft.PowerShell.Commands.ModuleCmdletBase.LoadModule(PSModuleInfo parentModule, String fileName, String moduleBase, String prefix, SessionState ss, Object privateData, ImportModuleOptions& options, ManifestProcessingFlags manifestProcessingFlags, Boolean& found, Boolean& moduleFileFound)

at Microsoft.PowerShell.Commands.ModuleCmdletBase.LoadModule(String fileName, String moduleBase, String prefix, SessionState ss, ImportModuleOptions& options, ManifestProcessingFlags manifestProcessingFlags, Boolean& found)

at Microsoft.PowerShell.Commands.ImportModuleCommand.ImportModule_LocallyViaName(ImportModuleOptions importModuleOptions, String name)

at Microsoft.PowerShell.Commands.ImportModuleCommand.ImportModule_LocallyViaName_WithTelemetry(ImportModuleOptions importModuleOptions, String name)

at Microsoft.PowerShell.Commands.ImportModuleCommand.ProcessRecord()

at System.Management.Automation.CommandProcessor.ProcessRecord()

Source : System.Management.Automation

HResult : -2146232799

CategoryInfo : NotSpecified: (:) [Import-Module], FileLoadException

FullyQualifiedErrorId : System.IO.FileLoadException,Microsoft.PowerShell.Commands.ImportModuleCommand

InvocationInfo :

MyCommand : Import-Module

ScriptLineNumber : 4

OffsetInLine : 9

HistoryId : 1

ScriptName : C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1

Line : $null = Import-Module -Name $ModulePath

PositionMessage : At C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1:4 char:9

+ $null = Import-Module -Name $ModulePath

+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

PSScriptRoot : C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0

PSCommandPath : C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1

InvocationName : Import-Module

CommandOrigin : Internal

ScriptStackTrace : at <ScriptBlock>, C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1: line 4

at <ScriptBlock>, C:\home\site\wwwroot\profile.ps1: line 20

2022-09-05T10:23:39Z [Warning] The Function app may be missing a module containing the 'Get-ScriptCmdlet' command definition. If this command belongs to a module available on the PowerShell Gallery, add a reference to this module to requirements.psd1. Make sure this module is compatible with PowerShell 7. For more details, see https://aka.ms/functions-powershell-managed-dependency. If the module is installed but you are still getting this error, try to import the module explicitly by invoking Import-Module just before the command that produces the error: this will not fix the issue but will expose the root cause.

2022-09-05T10:23:39Z [Error] ERROR: The term 'Get-ScriptCmdlet' is not recognized as the name of a cmdlet, function, script file, or operable program.

Check the spelling of the name, or if a path was included, verify that the path is correct and try again.

Exception :

Type : System.Management.Automation.CommandNotFoundException

ErrorRecord :

Exception :

Type : System.Management.Automation.ParentContainsErrorRecordException

Message : The term 'Get-ScriptCmdlet' is not recognized as the name of a cmdlet, function, script file, or operable program.

Check the spelling of the name, or if a path was included, verify that the path is correct and try again.

HResult : -2146233087

TargetObject : Get-ScriptCmdlet

CategoryInfo : ObjectNotFound: (Get-ScriptCmdlet:String) [], ParentContainsErrorRecordException

FullyQualifiedErrorId : CommandNotFoundException

InvocationInfo :

ScriptLineNumber : 11

OffsetInLine : 36

HistoryId : 1

ScriptName : C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts\ExportCustomCommands.ps1

Line : Export-ModuleMember -Function (Get-ScriptCmdlet -ScriptFolder $CustomScriptPath) -Alias (Get-ScriptCmdlet -ScriptFolder $CustomScriptPath -AsAlias)

PositionMessage : At C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts\ExportCustomCommands.ps1:11 char:36

+ Export-ModuleMember -Function (Get-ScriptCmdlet -ScriptFolder $Cu …

+ ~~~~~~~~~~~~~~~~

PSScriptRoot : C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts

PSCommandPath : C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts\ExportCustomCommands.ps1

InvocationName : Get-ScriptCmdlet

CommandOrigin : Internal

ScriptStackTrace : at <ScriptBlock>, C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts\ExportCustomCommands.ps1: line 11

at <ScriptBlock>, C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1: line 12

at <ScriptBlock>, C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1: line 11

at <ScriptBlock>, C:\home\site\wwwroot\profile.ps1: line 20

CommandName : Get-ScriptCmdlet

TargetSite :

Name : LookupCommandInfo

DeclaringType : System.Management.Automation.CommandDiscovery, System.Management.Automation, Version=7.0.11.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35

MemberType : Method

Module : System.Management.Automation.dll

StackTrace :

at System.Management.Automation.CommandDiscovery.LookupCommandInfo(String commandName, CommandTypes commandTypes, SearchResolutionOptions searchResolutionOptions, CommandOrigin commandOrigin, ExecutionContext context)

at System.Management.Automation.CommandDiscovery.LookupCommandProcessor(String commandName, CommandOrigin commandOrigin, Nullable`1 useLocalScope)

at System.Management.Automation.ExecutionContext.CreateCommand(String command, Boolean dotSource)

at System.Management.Automation.PipelineOps.AddCommand(PipelineProcessor pipe, CommandParameterInternal[] commandElements, CommandBaseAst commandBaseAst, CommandRedirection[] redirections, ExecutionContext context)

at System.Management.Automation.PipelineOps.InvokePipeline(Object input, Boolean ignoreInput, CommandParameterInternal[][] pipeElements, CommandBaseAst[] pipeElementAsts, CommandRedirection[][] commandRedirections, FunctionContext funcContext)

at System.Management.Automation.Interpreter.ActionCallInstruction`6.Run(InterpretedFrame frame)

at System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(InterpretedFrame frame)

at System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(InterpretedFrame frame)

Message : The term 'Get-ScriptCmdlet' is not recognized as the name of a cmdlet, function, script file, or operable program.

Check the spelling of the name, or if a path was included, verify that the path is correct and try again.

Data : System.Collections.ListDictionaryInternal

Source : System.Management.Automation

HResult : -2146233087

TargetObject : Get-ScriptCmdlet

CategoryInfo : ObjectNotFound: (Get-ScriptCmdlet:String) [], CommandNotFoundException

FullyQualifiedErrorId : CommandNotFoundException

InvocationInfo :

ScriptLineNumber : 11

OffsetInLine : 36

HistoryId : 1

ScriptName : C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts\ExportCustomCommands.ps1

Line : Export-ModuleMember -Function (Get-ScriptCmdlet -ScriptFolder $CustomScriptPath) -Alias (Get-ScriptCmdlet -ScriptFolder $CustomScriptPath -AsAlias)

PositionMessage : At C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts\ExportCustomCommands.ps1:11 char:36

+ Export-ModuleMember -Function (Get-ScriptCmdlet -ScriptFolder $Cu …

+ ~~~~~~~~~~~~~~~~

PSScriptRoot : C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts

PSCommandPath : C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts\ExportCustomCommands.ps1

InvocationName : Get-ScriptCmdlet

CommandOrigin : Internal

ScriptStackTrace : at <ScriptBlock>, C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts\ExportCustomCommands.ps1: line 11

at <ScriptBlock>, C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1: line 12

at <ScriptBlock>, C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1: line 11

at <ScriptBlock>, C:\home\site\wwwroot\profile.ps1: line 20

2022-09-05T10:23:39Z [Warning] The Function app may be missing a module containing the 'Get-ModuleCmdlet' command definition. If this command belongs to a module available on the PowerShell Gallery, add a reference to this module to requirements.psd1. Make sure this module is compatible with PowerShell 7. For more details, see https://aka.ms/functions-powershell-managed-dependency. If the module is installed but you are still getting this error, try to import the module explicitly by invoking Import-Module just before the command that produces the error: this will not fix the issue but will expose the root cause.

2022-09-05T10:23:39Z [Error] ERROR: The term 'Get-ModuleCmdlet' is not recognized as the name of a cmdlet, function, script file, or operable program.

Check the spelling of the name, or if a path was included, verify that the path is correct and try again.

Exception :

Type : System.Management.Automation.CommandNotFoundException

ErrorRecord :

Exception :

Type : System.Management.Automation.ParentContainsErrorRecordException

Message : The term 'Get-ModuleCmdlet' is not recognized as the name of a cmdlet, function, script file, or operable program.

Check the spelling of the name, or if a path was included, verify that the path is correct and try again.

HResult : -2146233087

TargetObject : Get-ModuleCmdlet

CategoryInfo : ObjectNotFound: (Get-ModuleCmdlet:String) [], ParentContainsErrorRecordException

FullyQualifiedErrorId : CommandNotFoundException

InvocationInfo :

ScriptLineNumber : 17

OffsetInLine : 30

HistoryId : 1

ScriptName : C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1

Line : Export-ModuleMember -Cmdlet (Get-ModuleCmdlet -ModulePath $ModulePath) -Alias (Get-ModuleCmdlet -ModulePath $ModulePath -AsAlias)

PositionMessage : At C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1:17 char:30

+ Export-ModuleMember -Cmdlet (Get-ModuleCmdlet -ModulePath $ModulePath …

+ ~~~~~~~~~~~~~~~~

PSScriptRoot : C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0

PSCommandPath : C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1

InvocationName : Get-ModuleCmdlet

CommandOrigin : Internal

ScriptStackTrace : at <ScriptBlock>, C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1: line 17

at <ScriptBlock>, C:\home\site\wwwroot\profile.ps1: line 20

CommandName : Get-ModuleCmdlet

TargetSite :

Name : LookupCommandInfo

DeclaringType : System.Management.Automation.CommandDiscovery, System.Management.Automation, Version=7.0.11.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35

MemberType : Method

Module : System.Management.Automation.dll

StackTrace :

at System.Management.Automation.CommandDiscovery.LookupCommandInfo(String commandName, CommandTypes commandTypes, SearchResolutionOptions searchResolutionOptions, CommandOrigin commandOrigin, ExecutionContext context)

at System.Management.Automation.CommandDiscovery.LookupCommandProcessor(String commandName, CommandOrigin commandOrigin, Nullable`1 useLocalScope)

at System.Management.Automation.ExecutionContext.CreateCommand(String command, Boolean dotSource)

at System.Management.Automation.PipelineOps.AddCommand(PipelineProcessor pipe, CommandParameterInternal[] commandElements, CommandBaseAst commandBaseAst, CommandRedirection[] redirections, ExecutionContext context)

at System.Management.Automation.PipelineOps.InvokePipeline(Object input, Boolean ignoreInput, CommandParameterInternal[][] pipeElements, CommandBaseAst[] pipeElementAsts, CommandRedirection[][] commandRedirections, FunctionContext funcContext)

at System.Management.Automation.Interpreter.ActionCallInstruction`6.Run(InterpretedFrame frame)

at System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(InterpretedFrame frame)

at System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(InterpretedFrame frame)

Message : The term 'Get-ModuleCmdlet' is not recognized as the name of a cmdlet, function, script file, or operable program.

Check the spelling of the name, or if a path was included, verify that the path is correct and try again.

Data : System.Collections.ListDictionaryInternal

Source : System.Management.Automation

HResult : -2146233087

TargetObject : Get-ModuleCmdlet

CategoryInfo : ObjectNotFound: (Get-ModuleCmdlet:String) [], CommandNotFoundException

FullyQualifiedErrorId : CommandNotFoundException

InvocationInfo :

ScriptLineNumber : 17

OffsetInLine : 30

HistoryId : 1

ScriptName : C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1

Line : Export-ModuleMember -Cmdlet (Get-ModuleCmdlet -ModulePath $ModulePath) -Alias (Get-ModuleCmdlet -ModulePath $ModulePath -AsAlias)

PositionMessage : At C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1:17 char:30

+ Export-ModuleMember -Cmdlet (Get-ModuleCmdlet -ModulePath $ModulePath …

+ ~~~~~~~~~~~~~~~~

PSScriptRoot : C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0

PSCommandPath : C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1

InvocationName : Get-ModuleCmdlet

CommandOrigin : Internal

ScriptStackTrace : at <ScriptBlock>, C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1: line 17

at <ScriptBlock>, C:\home\site\wwwroot\profile.ps1: line 20

2022-09-05T10:23:40Z [Error] Errors reported while executing profile.ps1. See logs for detailed errors. Profile location: C:\home\site\wwwroot\profile.ps1.

2022-09-05T10:23:40Z [Information] INFORMATION: PowerShell HTTP trigger function processed a request.

2022-09-05T10:23:42Z [Error] ERROR: Could not load file or assembly 'Newtonsoft.Json, Version=13.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed'. The system cannot find the file specified.

Exception :

Type : System.IO.FileNotFoundException

Message : Could not load file or assembly 'Newtonsoft.Json, Version=13.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed'. The system cannot find the file specified.

FileName : Newtonsoft.Json, Version=13.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed

TargetSite :

Name : MoveNext

DeclaringType : Microsoft.Graph.PowerShell.Authentication.Cmdlets.ConnectMgGraph+<ProcessRecordAsync>d__56, Microsoft.Graph.Authentication, Version=1.11.1.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35

MemberType : Method

Module : Microsoft.Graph.Authentication.dll

StackTrace :

at Microsoft.Graph.PowerShell.Authentication.Cmdlets.ConnectMgGraph.ProcessRecordAsync()

Source : Microsoft.Graph.Authentication

HResult : -2147024894

CategoryInfo : NotSpecified: (:) [Connect-MgGraph], FileNotFoundException

FullyQualifiedErrorId : Microsoft.Graph.PowerShell.Authentication.Cmdlets.ConnectMgGraph

InvocationInfo :

MyCommand : Connect-MgGraph

ScriptLineNumber : 85

OffsetInLine : 1

HistoryId : 1

ScriptName : C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1

Line : Connect-MgGraph -ClientID $appId -TenantId $tenantID -CertificateThumbprint $thumb ## Or -CertificateName "M365-License"

PositionMessage : At C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1:85 char:1

+ Connect-MgGraph -ClientID $appId -TenantId $tenantID -CertificateThum …

+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

PSScriptRoot : C:\home\site\wwwroot\HttpTrigger-Get-Licenses

PSCommandPath : C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1

InvocationName : Connect-MgGraph

CommandOrigin : Internal

ScriptStackTrace : at <ScriptBlock>, C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1: line 85

PipelineIterationInfo :

2022-09-05T10:23:42Z [Error] ERROR: Authentication needed, call Connect-MgGraph.

Exception :

Type : System.Security.Authentication.AuthenticationException

TargetSite :

Name : GetGraphHttpClient

DeclaringType : Microsoft.Graph.PowerShell.Authentication.Helpers.HttpHelpers

MemberType : Method

Module : Microsoft.Graph.Authentication.dll

StackTrace :

at Microsoft.Graph.PowerShell.Authentication.Helpers.HttpHelpers.GetGraphHttpClient(InvocationInfo invocationInfo, IAuthContext authContext)

at Microsoft.Graph.PowerShell.Module.BeforeCreatePipeline(InvocationInfo invocationInfo, HttpPipeline& pipeline)

at Microsoft.Graph.PowerShell.Module.CreatePipeline(InvocationInfo invocationInfo, String parameterSetName)

at Microsoft.Graph.PowerShell.Cmdlets.GetMgSubscribedSku_List1.ProcessRecordAsync()

Message : Authentication needed, call Connect-MgGraph.

Source : Microsoft.Graph.Authentication

HResult : -2146233087

CategoryInfo : NotSpecified: (:) [Get-MgSubscribedSku_List1], AuthenticationException

FullyQualifiedErrorId : Microsoft.Graph.PowerShell.Cmdlets.GetMgSubscribedSku_List1

InvocationInfo :

MyCommand : Get-MgSubscribedSku_List1

ScriptLineNumber : 34

OffsetInLine : 1

HistoryId : 1

ScriptName : C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1

Line : $licenseUsage = Get-MgSubscribedSku | Select-Object -Property SkuPartNumber,CapabilityStatus,@{Name="PrepaidUnits";expression={$_.PrepaidUnits.Enabled -join ";"}},ConsumedUnits,SkuId,AppliesTo

PositionMessage : At C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1:34 char:1

+ $licenseUsage = Get-MgSubscribedSku | Select-Object -Property SkuPart …

+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

PSScriptRoot : C:\home\site\wwwroot\HttpTrigger-Get-Licenses

PSCommandPath : C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1

InvocationName : Get-MgSubscribedSku

CommandOrigin : Internal

ScriptStackTrace : at Get-MgSubscribedSku<Process>, C:\home\data\ManagedDependencies\2209042059148689947.r\Microsoft.Graph.Identity.DirectoryManagement\1.10.0\exports\v1.0\ProxyCmdletDefinitions.ps1: line 12245

at Get-LicenseUsage, C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1: line 34

at <ScriptBlock>, C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1: line 88

PipelineIterationInfo :

image

The error message Assembly with same name is already loaded is caused by a newer version of the module that has already been downloaded, which causes the older module we are trying to import to fail. One of the ways to resolve this is to remove the newer module with Kudu.

Every app that is created has a companion app created for it (https://docs.microsoft.com/en-us/azure/app-service/resources-kudu#access-kudu-for-your-app) and this app named Kudu can be accessed via:

Go to: https://<app-name>.scm.azurewebsites.net

image

You can browse the directories of the app by clicking on Debug console > PowerShell:

image

Navigate to: data

image

Then: ManagedDependencies

image

You may find several folders in this directory:

image

Proceed to browse into these folders and you’ll likely see the newer 1.11.1:

imageimage

Using the edit button to review the contents of requirements.psd1 will also show the older version using 1.* for the modules:

image
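The 1.* wildcard in requirements.psd1 is what allows a newer module release to be downloaded automatically. To prevent the same break from recurring, one option is to pin exact versions in requirements.psd1 (a sketch; the module names and versions below are illustrative and should match what your functions were tested with):

```powershell
# requirements.psd1 — pin exact versions instead of the '1.*' wildcard
# so a new Microsoft.Graph release cannot be pulled down automatically.
# Versions below are illustrative; use the ones your functions were tested with.
@{
    'Microsoft.Graph.Authentication'               = '1.10.0'
    'Microsoft.Graph.Identity.DirectoryManagement' = '1.10.0'
}
```

The trade-off is that pinned versions will not pick up fixes automatically, so updates become a deliberate change rather than a surprise.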

Proceed to delete the old ManagedDependencies folder:

image

Then delete the newer 1.11.1 folder in the remaining ManagedDependencies folder:

image

With the newer module removed, we should now be able to run the function app:

image

Hope this helps anyone who may have used my previous post and noticed it did not work.

Using an Azure Automation Account Runbook to create and email a Duo report with SendGrid


I received a lot of great feedback on my previous Azure Function App with Logic App posts, which demonstrated how to create HTML reports that were scheduled to be mailed out at the end of the month. A common question I received after those posts was whether it was possible to create these reports with Azure Automation Accounts and Runbooks. The answer is certainly yes, and I would like to demonstrate this with another report I had created that provides a count of the users provisioned in Cisco Duo, which is commonly used for 2-factor authentication.

Along with using an Azure Automation Account for reports, I would also like to use this post to demonstrate the following:

  1. How to import a custom PowerShell module into an Automation Account (we’ll be using the Duo PowerShell module that Matt Egan has created and shared via his GitHub: https://github.com/mbegan/Duo-PSModule)
  2. How to use an Automation Account Runbook to generate a report
  3. Storing and retrieving a secret in Azure Key Vault (we’ll be storing the API key for SendGrid in an Azure Key Vault)
  4. Using SendGrid to send the report via email

Step #1 – Create an Automation Account

The first step is to create the Automation Account that we’ll be using to host the Runbook that generates the report and uses SendGrid to send it out. We’ll be storing the SendGrid API key in an Azure Key Vault and need to grant the Automation Account access to that secret, which is why the account needs to be created first.

image

image

We’ll need the Automation Account’s System assigned Object (principal) ID to grant it permissions to access the Key Vault secret, so navigate to the Identity blade and copy the Object (principal) ID that will be used in the following steps:

image

Step #2 – Create an Azure Key Vault, add SendGrid API Key as Secret, grant Automation Account permissions to read secret

With the Automation Account created, proceed to create an Azure Key Vault if you do not already have one:

image

The preferred way for the Access configuration today is Azure role-based access control so our Automation Account will only be able to access a specific secret. The older Vault access policy requires permissions to be granted to the whole vault, which makes it difficult to secure other secrets in the vault. Proceed to select Azure role-based access control as the Permission model under the Access policy menu:

image

Customize the settings as required or leave them as default and create the Azure Key Vault.

image

The account used to create the new vault will not have any permissions to the vault, so attempting to access areas such as Secrets will display the message: You are unauthorized to view these contents.

image

Proceed to grant permissions to the account you’re logged in as through the Access control (IAM) blade > Add role assignment:

image

Search for Key Vault Administrator and grant the permission:

image

image

With the appropriate permissions to the vault assigned, proceed to create a new Secret to store the SendGrid API Key:

image

image

image

With the secret created, we can now grant the Automation Account the ability to retrieve the secret. Click on the SendGrid secret:

image

In the SendGrid secret, navigate to the Access control (IAM) blade, click Add, then Add role assignment:

image

image

Select Key Vault Secrets User as the role:

image

Select Managed identity, then locate the Automation Account and select it:

image

Proceed to assign the permissions:

image

The Automation Account should now have permissions to retrieve the SendGrid secret value:
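The portal steps above can also be scripted with the Az PowerShell modules. A minimal sketch, assuming illustrative vault, resource group, and Automation Account names:

```powershell
# Sketch: grant an Automation Account's managed identity read access to one secret.
# All names below are illustrative placeholders.
$vaultName  = 'kv-Production'
$rgName     = 'rg-automation'
$secretName = 'SendGrid'

# Object (principal) ID of the Automation Account's system-assigned identity
$identity = (Get-AzAutomationAccount -ResourceGroupName $rgName `
    -Name 'aa-reports').Identity.PrincipalId

# Scope the role assignment to the individual secret, not the whole vault
$vault = Get-AzKeyVault -VaultName $vaultName
New-AzRoleAssignment -ObjectId $identity `
    -RoleDefinitionName 'Key Vault Secrets User' `
    -Scope "$($vault.ResourceId)/secrets/$secretName"
```

Scoping the assignment to the secret rather than the vault mirrors the least-privilege approach described above.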

Step #3 – Import the custom Duo PowerShell Module

Rather than attempt to write the PowerShell code required to authenticate with the Duo Admin API (https://duo.com/docs/adminapi) with an HMAC signature and then call the API methods, we’ll be using Matt Egan’s PowerShell module, which he shared with the community years ago and still works today: https://github.com/mbegan/Duo-PSModule
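For context, this is roughly what the module handles for us: the Duo Admin API expects each request to carry an Authorization header of ikey:signature, where the signature is an HMAC-SHA1 of a canonical string built from the date, method, host, path, and parameters. A rough sketch based on the Duo Admin API docs (the keys and hostname are placeholders):

```powershell
# Sketch of the HMAC-SHA1 signing that the Duo module performs for us.
# Based on the Duo Admin API docs; keys and host below are placeholders.
$ikey  = 'DIXXXXXXXXXXXXXXXXXX'
$skey  = '<secret-key>'
$hostN = 'api-xxxxxxxx.duosecurity.com'
$date  = (Get-Date).ToUniversalTime().ToString('r')

# Canonical string: date, method, host, path, then url-encoded params (empty here)
$canon = "$date`nGET`n$hostN`n/admin/v1/users`n"

$hmac = [System.Security.Cryptography.HMACSHA1]::new(
    [Text.Encoding]::UTF8.GetBytes($skey))
$sig  = -join ($hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes($canon)) |
    ForEach-Object { $_.ToString('x2') })

# The request authenticates with HTTP basic auth of ikey:signature
$auth = [Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes("${ikey}:${sig}"))
# Invoke-RestMethod -Uri "https://$hostN/admin/v1/users" `
#     -Headers @{ Date = $date; Authorization = "Basic $auth" }
```

Having the module take care of this canonicalization and signing is exactly why it’s worth importing rather than rolling our own.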

The Duo PowerShell module Matt Egan provided does not simply upload into Azure Automation’s Modules blade because the psd1 file references the Duo_org.ps1 file that is meant to store the information required to connect to the Duo API.

Neil Sabol has a great write-up that explains this and how to work around the issue, so I’ll be using his method to demonstrate the configuration: https://blog.neilsabol.site/post/importing-duo-psmodule-mfa-powershell-module-azure-automation/

The method I’ll be using is not to upload a blank Duo_org.ps1 file but rather to comment out all references to it in the Duo.psd1 file. You can find the updated file in my GitHub: https://github.com/terenceluk/Azure/blob/main/Automation%20Runbook/Duo/Duo.psd1

Proceed to download the Duo.psd1 and Duo.psm1 from my GitHub https://github.com/terenceluk/Azure/tree/main/Automation%20Runbook/Duo, zip them up into a package named Duo.zip (make sure Duo.zip is the file name), then import them into the Automation Account Modules:

imageimage

I haven’t had much luck with 7.1 as the Runtime version, so proceed to select 5.1 as the Runtime version:

image

Initiate the import:

image

Confirm the module has successfully imported:

image

One of the ways to check whether the module imported properly is to click into the module and verify that the available cmdlets are displayed:

image
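If you prefer to script the import rather than use the portal, the zip package can also be published with Az.Automation. A sketch, assuming illustrative account names and a placeholder storage URL (the zip must be hosted somewhere the service can reach, such as a blob with a SAS token):

```powershell
# Sketch: import the packaged Duo module into the Automation Account.
# Resource group, account name, and ContentLinkUri are illustrative placeholders.
New-AzAutomationModule -ResourceGroupName 'rg-automation' `
    -AutomationAccountName 'aa-reports' `
    -Name 'Duo' `
    -ContentLinkUri 'https://mystorage.blob.core.windows.net/modules/Duo.zip?<sas>'
```

As with the portal import, verify afterwards that the module status shows as available and that its cmdlets are listed.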

Step #4 – Create a Protected Application in Duo and add authentication information as Automation Account encrypted variables

Using the Duo Admin API requires authentication so we’ll need to create a protected application in the Duo Admin portal as described in the document here: https://duo.com/docs/adminapi

imageimage

Copy the Integration key, Secret key, and API hostname as we’ll need them to create the encrypted variables in the following steps, and grant the application the required permissions:

image

Proceed to the Automation Account, navigate to the Variables blade, and create the following encrypted variables:

  1. MyDuoDirectoryID
  2. MyDuoIntegrationKey
  3. MyDuoSecretKey
  4. MyDuoAPIHostname
image
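Creating the four variables can also be done with Az.Automation; marking them as encrypted stores the values encrypted at rest and keeps them hidden in the portal. A sketch, where the account names and values are placeholders:

```powershell
# Sketch: create the Duo connection details as encrypted Automation variables.
# Resource group / account names and the values are illustrative placeholders.
$params = @{
    ResourceGroupName     = 'rg-automation'
    AutomationAccountName = 'aa-reports'
    Encrypted             = $true
}
New-AzAutomationVariable @params -Name 'MyDuoDirectoryID'    -Value '<directory-id>'
New-AzAutomationVariable @params -Name 'MyDuoIntegrationKey' -Value '<integration-key>'
New-AzAutomationVariable @params -Name 'MyDuoSecretKey'      -Value '<secret-key>'
New-AzAutomationVariable @params -Name 'MyDuoAPIHostname'    -Value 'api-xxxxxxxx.duosecurity.com'
```

Inside the runbook, these are read back with Get-AutomationVariable -Name '<variable-name>'; encrypted variable values can only be retrieved from within a runbook.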

Step #5 – Create the runbook to generate a report of the users and email via SendGrid

With all the components and permissions created and configured, the last step is to create the runbook and add the code that will build the report and use the SendGrid API to send the email report. From within the Automation Account, navigate to the Runbooks blade:

image

Click on Create a runbook:

image

Fill in the required fields:

image

The following PowerShell Runbook will be displayed where we can paste the PowerShell script to be executed:

image

The script I will be using to generate and email the report can be found here: https://github.com/terenceluk/Azure/blob/main/Automation%20Runbook/Email-Duo-User-Count-and-List.ps1

Customize the following variables:

$VaultName = "kv-Production" # Azure Key Vault name

$destEmailAddress = "tluk@contoso.com" # To address

$fromEmailAddress = "duoreport@contoso.com" # From address

$subject = "Duo User Report" # Email subject

<h2>Client: Contoso Limited</h2>
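The core of the runbook follows this shape: authenticate as the managed identity, fetch the SendGrid key from Key Vault, build the HTML body, then POST it to the SendGrid v3 mail/send API. A condensed sketch, not the full script linked above, using the same placeholder values being customized:

```powershell
# Condensed sketch of the runbook flow; see the full script linked above.
# Vault/secret names and email addresses are the placeholders being customized.
Connect-AzAccount -Identity | Out-Null   # sign in as the system-assigned identity

$apiKey = Get-AzKeyVaultSecret -VaultName 'kv-Production' `
    -Name 'SendGrid' -AsPlainText

$body = @{
    personalizations = @(@{ to = @(@{ email = 'tluk@contoso.com' }) })
    from             = @{ email = 'duoreport@contoso.com' }
    subject          = 'Duo User Report'
    content          = @(@{ type = 'text/html'; value = '<h2>Client: Contoso Limited</h2>' })
} | ConvertTo-Json -Depth 6

Invoke-RestMethod -Uri 'https://api.sendgrid.com/v3/mail/send' -Method Post `
    -Headers @{ Authorization = "Bearer $apiKey" } `
    -ContentType 'application/json' -Body $body
```

Note that -AsPlainText on Get-AzKeyVaultSecret requires a recent Az.KeyVault module; older versions return a SecureString that has to be converted manually.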

Proceed to paste the code into the runbook:

image

Proceed to click Save:

image

Before publishing the Runbook, click on the Test pane button to bring up the test window, then click on the Start button to test the runbook:

image

image

Confirm that the test successfully completes:

image

Verify that the email sent contains the expected report as such:

image

Proceed to publish the runbook once we’ve confirmed that the report is delivered:

image

The last step is to schedule this runbook according to when the report should be run, so proceed to click on the Schedules blade and then Add a schedule:

image

Click on Link a schedule to your runbook:

image

Click on Add a Schedule:

image

Configure the desired schedule (the settings in the screenshot are configured to run on the last day of the month):

image

The schedules pane should now display the configured schedule and its next run time:

image

That’s it. Hope this helps anyone who might be looking for information on how to configure the various components demonstrated in this post.

Securing an Azure Function App to require authentication and granting access to a Logic App’s Managed Identity


One of the topics I’ve been meaning to write about is how to secure Function Apps with authentication to prevent unauthorized calls, because a newly created Function App and its functions are accessible via the internet by default, meaning anyone who knows the URL can call them without authentication. Take one of my previous posts:

Using Azure Function App and Logic Apps to create an automated report that downloads a CSV, creates a HTML table, sends an email with the HTML table and attaches the CSV
http://terenceluk.blogspot.com/2022/08/using-azure-function-app-and-logic-apps.html

… where I created an Azure Function that expects a POST method containing the Cylance token it can use to download an export of all the devices, generate an HTML report, then return the content. One can argue that even if the URL were discovered, a malicious attacker would need to know the Cylance token to obtain any meaningful data back. This is true, but a malicious attacker could still cause financial impact by calling the Azure Function repeatedly to rack up unnecessary charges.

Below is the Function App:

image

… and the function within the Function App that contains the PowerShell script to process the request after receiving the token:

image

Confirming Function App does not require authentication with Postman

To demonstrate that this function does not require authentication, we can test by using the Get Function Url:

image

Copy the URL:

image

Then open Postman or any API testing tool and create a POST request with the URL:

image

Select No Auth for the Type under Authorization:

image

We’ll be passing JSON to the Function App, so navigate to Headers, uncheck Content-Type: text/plain, then add the following header:

Key: Content-Type
Value: application/json

image

Navigate to the Body section and paste the Cylance token for the report:

image

Proceed to send the request and you’ll see a Status: 200 OK with the returned HTML of the report:

image
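The same unauthenticated call can be reproduced from PowerShell, which is a quick way to retest after the authentication changes later in this post (the function URL and token value below are placeholders):

```powershell
# Sketch: call the function anonymously, as Postman does above.
# The function URL and token value are illustrative placeholders.
$uri  = 'https://func-cylance-report.azurewebsites.net/api/HttpTrigger-Report'
$body = @{ token = '<cylance-token>' } | ConvertTo-Json

# Succeeds (HTTP 200) while the function allows anonymous calls;
# fails with 401 Unauthorized once an identity provider is enforced.
Invoke-RestMethod -Uri $uri -Method Post -ContentType 'application/json' -Body $body
```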

Creating a System Managed Identity for the Logic App that calls the Function App

In this example, I’ll be configuring a System Managed Identity for the Logic App that calls the Function App to generate the report so it (the Logic App) can take the report and email it out. The System Managed Identity will be given permission to call the Function App, and any calls made without authentication will fail. Using a System Managed Identity allows the Logic App to authenticate with an Azure AD identity without having to provide a secret: Azure manages the identity, so there are no secrets to rotate or maintain.

**Note that it is also possible to use a User-assigned Identity, and the official Microsoft documentation can be found here: https://docs.microsoft.com/en-us/azure/logic-apps/create-managed-service-identity?tabs=consumption

Begin by opening the Logic App that calls the Function App, then navigate to its Identity blade:

image

Under the System assigned heading, switch the Status from Off to On and click on Save:

image

Note the message being displayed indicates that the Logic App will be registered in Azure AD so it can be granted permissions to access other resources protected by Azure AD (in this case it will be the Function App):

image

The process should complete in a few seconds and an Object (principal) ID will be displayed. This attribute represents the Logic App’s Managed Identity:

image

We’ll need the Application ID of the Logic App’s Managed Identity, which isn’t displayed in the Identity blade of the Logic App (don’t mistake the Object (principal) ID for the App ID). It can be retrieved from the Azure AD Enterprise Applications list when Application Type is set to All Applications or Managed Identities:

image

Copy the Application ID value as we’ll need it to grant this Logic App permissions to the Function App.

Require Authentication for Function Calls

The first requirement to allow the Logic App’s Managed Identity to authenticate with the Function App is to set the function’s authLevel to anonymous so that the function key check does not interfere with the Azure AD authentication we’ll configure next. Failure to do so will result in the Logic App workflow throwing a BadRequest error.

Begin by opening the Function App, navigate to the Advanced Tools blade, then click on Go to open Kudu Services:

image

image

Click on the Debug Console menu and select CMD:

image

Navigate to site > wwwroot > <function name>

image

Edit the function.json file and look for authLevel in the bindings. If the property exists, set the property value to anonymous. Otherwise, add that property and set the value as anonymous:

image

The Function App in this example already has the authLevel property, so we just need to update the value from function to anonymous:

image
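For reference, the resulting bindings look like this (a trimmed sketch; your function.json will have additional properties specific to your function):

```json
{
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "Request",
      "methods": [ "post" ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "Response"
    }
  ]
}
```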

Clicking the Save button will bring us back to the command prompt:

image

Create an App Registration for the Function App in Azure AD

With the Logic App having a managed identity to authenticate against the Function App, we now need to create an App Registration for the Function App in Azure AD so it can be used to grant permissions to the managed identity of the Logic App.

We’ve already copied the Application ID for the Logic App earlier, but we will also need the tenant ID of the Azure AD, which can be retrieved by navigating to Azure Active Directory:

image

Then in the Overview blade, copy the Tenant ID:

image

Navigate back into the Function App and click on the Authentication blade, then the Add an identity provider button:

image

Select Microsoft for the Identity provider so we can use Azure AD identities:

image

In the App registration type section, select Provide the details of an existing app registration:

image

Fill in the following fields:

Application (client)ID: <The Logic App’s Application ID>
Client secret (recommended): <blank>
Issuer URL: https://sts.windows.net/<Tenant-ID>
Allowed token audiences: <The Logic App’s Application ID>
Restrict access: Require authentication
Unauthenticated requests: HTTP 401 Unauthorized: recommended for APIs

   image

Proceed to click Add to save the settings, and you should see an identity provider representing the Logic App listed in the Authentication blade of the Function App:

  image

Confirming Function App requires authentication with Postman and Portal

Repeating the steps we performed previously with Postman should now return a Status: 401 Unauthorized and the message:

You do not have permission to view this directory or page.

image

Attempting to navigate to the URL of the Function App should also display the same message:

image

Instead of the default Your Functions 4.0 app is up and running:

image

Updating Logic App to authenticate with Managed Identity

Attempting to immediately trigger the Logic App after the changes will fail and return an Unauthorized response when calling the Azure Function App:

image

To correct the issue, enter the designer mode for the Logic App, edit the Function App step, click on Add new parameter, check the Authentication box:

image

Select Managed identity for the Authentication type:

image

Select System-assigned managed identity for Managed identity and <Logic App’s Managed Identity Application ID> as the Audience:

  image

Proceed to run the trigger and confirm that steps succeed:

image

**Note that one of the common issues I’ve come across is the Logic App’s Object (principal) ID being used as the Application ID when configuring the Azure Function App’s Authentication. Doing so may cause the following error to be thrown when executing the Logic App:

BadRequest. Http request failed as there is an error getting AD OAuth token: 'AADSTS500011: The resource principal named a6486475-xxx-xxx-xxxx-6exxxxxx882f was not found in the tenant named Contoso. This can happen if the application has not been installed by the administrator of the tenant or consented to by any user in the tenant. You might have sent your authentication request to the wrong tenant. Trace ID: a31a44e7-7135-450f-a420-aa801a8ec003 Correlation ID: 58f958c1-6a75-4fef-bc75-bb22dae4c8c9 Timestamp: 2022-09-11 20:48:23Z'.

image

Ensure that you used the Application ID for the appropriate fields and this error should be resolved.

I hope this helps anyone who might be looking for a walkthrough for securing an Azure Function App and using Managed Identity for a Logic App to call the function. The Microsoft documentation today isn’t very clear to me so I hope this will provide some clarity.


Azure Function App fails with: “ERROR: Assembly with same name is already loaded”


I was recently notified by a colleague that the Azure Function I had demonstrated in my previous post:

Create an automated report for Office 365 / Microsoft 365 license usage with friendly names using an Azure Function App and Logic Apps
http://terenceluk.blogspot.com/2022/07/create-automated-report-for-office-365.html

… would fail and no longer report the licenses. The error messages are as follows:

2022-10-05T10:22:18 Welcome, you are now connected to log-streaming service. The default timeout is 2 hours. Change the timeout with the App Setting SCM_LOGSTREAM_TIMEOUT (in seconds).

2022-10-05T10:22:29.646 [Error] ERROR: Assembly with same name is already loadedException :Type : System.IO.FileLoadExceptionMessage : Assembly with same name is already loadedTargetSite :Name : LoadBinaryModuleDeclaringType : Microsoft.PowerShell.Commands.ModuleCmdletBaseMemberType : MethodModule : System.Management.Automation.dllStackTrace :at Microsoft.PowerShell.Commands.ModuleCmdletBase.LoadBinaryModule(PSModuleInfo parentModule, Boolean trySnapInName, String moduleName, String fileName, Assembly assemblyToLoad, String moduleBase, SessionState ss, ImportModuleOptions options, ManifestProcessingFlags manifestProcessingFlags, String prefix, Boolean loadTypes, Boolean loadFormats, Boolean& found, String shortModuleName, Boolean disableFormatUpdates)at Microsoft.PowerShell.Commands.ModuleCmdletBase.LoadBinaryModule(Boolean trySnapInName, String moduleName, String fileName, Assembly assemblyToLoad, String moduleBase, SessionState ss, ImportModuleOptions options, ManifestProcessingFlags manifestProcessingFlags, String prefix, Boolean loadTypes, Boolean loadFormats, Boolean& found)at Microsoft.PowerShell.Commands.ModuleCmdletBase.LoadModule(PSModuleInfo parentModule, String fileName, String moduleBase, String prefix, SessionState ss, Object privateData, ImportModuleOptions& options, ManifestProcessingFlags manifestProcessingFlags, Boolean& found, Boolean& moduleFileFound)at Microsoft.PowerShell.Commands.ModuleCmdletBase.LoadModule(String fileName, String moduleBase, String prefix, SessionState ss, ImportModuleOptions& options, ManifestProcessingFlags manifestProcessingFlags, Boolean& found)at Microsoft.PowerShell.Commands.ImportModuleCommand.ImportModule_LocallyViaName(ImportModuleOptions importModuleOptions, String name)at Microsoft.PowerShell.Commands.ImportModuleCommand.ImportModule_LocallyViaName_WithTelemetry(ImportModuleOptions importModuleOptions, String name)at Microsoft.PowerShell.Commands.ImportModuleCommand.ProcessRecord()at 
System.Management.Automation.CommandProcessor.ProcessRecord()Source : System.Management.AutomationHResult : -2146232799CategoryInfo : NotSpecified: (:) [Import-Module], FileLoadExceptionFullyQualifiedErrorId : System.IO.FileLoadException,Microsoft.PowerShell.Commands.ImportModuleCommandInvocationInfo :MyCommand : Import-ModuleScriptLineNumber : 4OffsetInLine : 9HistoryId : 1ScriptName : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1Line : $null = Import-Module -Name $ModulePathPositionMessage : At C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1:4 char:9+ $null = Import-Module -Name $ModulePath+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~PSScriptRoot : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0PSCommandPath : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1InvocationName : Import-ModuleCommandOrigin : InternalScriptStackTrace : at <ScriptBlock>, C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1: line 4at <ScriptBlock>, C:\home\site\wwwroot\profile.ps1: line 20Microsoft.Azure.WebJobs.Script.Workers.Rpc.RpcException : Result: ERROR: Assembly with same name is already loadedException :Type : System.IO.FileLoadExceptionMessage : Assembly with same name is already loadedTargetSite :Name : LoadBinaryModuleDeclaringType : Microsoft.PowerShell.Commands.ModuleCmdletBaseMemberType : MethodModule : System.Management.Automation.dllStackTrace :at Microsoft.PowerShell.Commands.ModuleCmdletBase.LoadBinaryModule(PSModuleInfo parentModule, Boolean trySnapInName, String moduleName, String fileName, Assembly assemblyToLoad, String moduleBase, SessionState ss, ImportModuleOptions options, ManifestProcessingFlags 
manifestProcessingFlags, String prefix, Boolean loadTypes, Boolean loadFormats, Boolean& found, String shortModuleName, Boolean disableFormatUpdates)at Microsoft.PowerShell.Commands.ModuleCmdletBase.LoadBinaryModule(Boolean trySnapInName, String moduleName, String fileName, Assembly assemblyToLoad, String moduleBase, SessionState ss, ImportModuleOptions options, ManifestProcessingFlags manifestProcessingFlags, String prefix, Boolean loadTypes, Boolean loadFormats, Boolean& found)at Microsoft.PowerShell.Commands.ModuleCmdletBase.LoadModule(PSModuleInfo parentModule, String fileName, String moduleBase, String prefix, SessionState ss, Object privateData, ImportModuleOptions& options, ManifestProcessingFlags manifestProcessingFlags, Boolean& found, Boolean& moduleFileFound)at Microsoft.PowerShell.Commands.ModuleCmdletBase.LoadModule(String fileName, String moduleBase, String prefix, SessionState ss, ImportModuleOptions& options, ManifestProcessingFlags manifestProcessingFlags, Boolean& found)at Microsoft.PowerShell.Commands.ImportModuleCommand.ImportModule_LocallyViaName(ImportModuleOptions importModuleOptions, String name)at Microsoft.PowerShell.Commands.ImportModuleCommand.ImportModule_LocallyViaName_WithTelemetry(ImportModuleOptions importModuleOptions, String name)at Microsoft.PowerShell.Commands.ImportModuleCommand.ProcessRecord()at System.Management.Automation.CommandProcessor.ProcessRecord()Source : System.Management.AutomationHResult : -2146232799CategoryInfo : NotSpecified: (:) [Import-Module], FileLoadExceptionFullyQualifiedErrorId : System.IO.FileLoadException,Microsoft.PowerShell.Commands.ImportModuleCommandInvocationInfo :MyCommand : Import-ModuleScriptLineNumber : 4OffsetInLine : 9HistoryId : 1ScriptName : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1Line : $null = Import-Module -Name $ModulePathPositionMessage : At 
C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1:4 char:9+ $null = Import-Module -Name $ModulePath+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~PSScriptRoot : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0PSCommandPath : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1InvocationName : Import-ModuleCommandOrigin : InternalScriptStackTrace : at <ScriptBlock>, C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1: line 4at <ScriptBlock>, C:\home\site\wwwroot\profile.ps1: line 20Exception: Assembly with same name is already loadedStack: at Microsoft.PowerShell.Commands.ModuleCmdletBase.LoadBinaryModule(PSModuleInfo parentModule, Boolean trySnapInName, String moduleName, String fileName, Assembly assemblyToLoad, String moduleBase, SessionState ss, ImportModuleOptions options, ManifestProcessingFlags manifestProcessingFlags, String prefix, Boolean loadTypes, Boolean loadFormats, Boolean& found, String shortModuleName, Boolean disableFormatUpdates)at Microsoft.PowerShell.Commands.ModuleCmdletBase.LoadBinaryModule(Boolean trySnapInName, String moduleName, String fileName, Assembly assemblyToLoad, String moduleBase, SessionState ss, ImportModuleOptions options, ManifestProcessingFlags manifestProcessingFlags, String prefix, Boolean loadTypes, Boolean loadFormats, Boolean& found)at Microsoft.PowerShell.Commands.ModuleCmdletBase.LoadModule(PSModuleInfo parentModule, String fileName, String moduleBase, String prefix, SessionState ss, Object privateData, ImportModuleOptions& options, ManifestProcessingFlags manifestProcessingFlags, Boolean& found, Boolean& moduleFileFound)at Microsoft.PowerShell.Commands.ModuleCmdletBase.LoadModule(String fileName, String moduleBase, String prefix, SessionState ss, 
ImportModuleOptions& options, ManifestProcessingFlags manifestProcessingFlags, Boolean& found)at Microsoft.PowerShell.Commands.ImportModuleCommand.ImportModule_LocallyViaName(ImportModuleOptions importModuleOptions, String name)at Microsoft.PowerShell.Commands.ImportModuleCommand.ImportModule_LocallyViaName_WithTelemetry(ImportModuleOptions importModuleOptions, String name)at Microsoft.PowerShell.Commands.ImportModuleCommand.ProcessRecord()at System.Management.Automation.CommandProcessor.ProcessRecord()

2022-10-05T10:22:31.425 [Warning] The Function app may be missing a module containing the 'Get-ScriptCmdlet' command definition. If this command belongs to a module available on the PowerShell Gallery, add a reference to this module to requirements.psd1. Make sure this module is compatible with PowerShell 7. For more details, see https://aka.ms/functions-powershell-managed-dependency. If the module is installed but you are still getting this error, try to import the module explicitly by invoking Import-Module just before the command that produces the error: this will not fix the issue but will expose the root cause.

2022-10-05T10:22:31.638 [Error] ERROR: The term 'Get-ScriptCmdlet' is not recognized as the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
Exception :
Type : System.Management.Automation.CommandNotFoundException
ErrorRecord :
Exception :
Type : System.Management.Automation.ParentContainsErrorRecordException
Message : The term 'Get-ScriptCmdlet' is not recognized as the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
HResult : -2146233087
TargetObject : Get-ScriptCmdlet
CategoryInfo : ObjectNotFound: (Get-ScriptCmdlet:String) [], ParentContainsErrorRecordException
FullyQualifiedErrorId : CommandNotFoundException
InvocationInfo :
ScriptLineNumber : 11
OffsetInLine : 36
HistoryId : 1
ScriptName : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts\ExportCustomCommands.ps1
Line : Export-ModuleMember -Function (Get-ScriptCmdlet -ScriptFolder $CustomScriptPath) -Alias (Get-ScriptCmdlet -ScriptFolder $CustomScriptPath -AsAlias)
PositionMessage : At C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts\ExportCustomCommands.ps1:11 char:36
+ Export-ModuleMember -Function (Get-ScriptCmdlet -ScriptFolder $Cu …
+ ~~~~~~~~~~~~~~~~
PSScriptRoot : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts
PSCommandPath : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts\ExportCustomCommands.ps1
InvocationName : Get-ScriptCmdlet
CommandOrigin : Internal
ScriptStackTrace :
at <ScriptBlock>, C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts\ExportCustomCommands.ps1: line 11
at <ScriptBlock>, C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1: line 12
at <ScriptBlock>, C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1: line 11
at <ScriptBlock>, C:\home\site\wwwroot\profile.ps1: line 20
CommandName : Get-ScriptCmdlet
TargetSite :
Name : LookupCommandInfo
DeclaringType : System.Management.Automation.CommandDiscovery, System.Management.Automation, Version=7.0.9.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35
MemberType : Method
Module : System.Management.Automation.dll
StackTrace :
at System.Management.Automation.CommandDiscovery.LookupCommandInfo(String commandName, CommandTypes commandTypes, SearchResolutionOptions searchResolutionOptions, CommandOrigin commandOrigin, ExecutionContext context)
at System.Management.Automation.CommandDiscovery.LookupCommandInfo(String commandName, CommandOrigin commandOrigin, ExecutionContext context)
at System.Management.Automation.CommandDiscovery.LookupCommandInfo(String commandName, CommandOrigin commandOrigin)
at System.Management.Automation.CommandDiscovery.LookupCommandProcessor(String commandName, CommandOrigin commandOrigin, Nullable`1 useLocalScope)
at System.Management.Automation.ExecutionContext.CreateCommand(String command, Boolean dotSource)
at System.Management.Automation.PipelineOps.AddCommand(PipelineProcessor pipe, CommandParameterInternal[] commandElements, CommandBaseAst commandBaseAst, CommandRedirection[] redirections, ExecutionContext context)
at System.Management.Automation.PipelineOps.InvokePipeline(Object input, Boolean ignoreInput, CommandParameterInternal[][] pipeElements, CommandBaseAst[] pipeElementAsts, CommandRedirection[][] commandRedirections, FunctionContext funcContext)
at System.Management.Automation.Interpreter.ActionCallInstruction`6.Run(InterpretedFrame frame)
at System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(InterpretedFrame frame)
at System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(InterpretedFrame frame)
Message : The term 'Get-ScriptCmdlet' is not recognized as the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
Data : System.Collections.ListDictionaryInternal
Source : System.Management.Automation
HResult : -2146233087
TargetObject : Get-ScriptCmdlet
CategoryInfo : ObjectNotFound: (Get-ScriptCmdlet:String) [], CommandNotFoundException
FullyQualifiedErrorId : CommandNotFoundException
InvocationInfo :
ScriptLineNumber : 11
OffsetInLine : 36
HistoryId : 1
ScriptName : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts\ExportCustomCommands.ps1
Line : Export-ModuleMember -Function (Get-ScriptCmdlet -ScriptFolder $CustomScriptPath) -Alias (Get-ScriptCmdlet -ScriptFolder $CustomScriptPath -AsAlias)
PositionMessage : At C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts\ExportCustomCommands.ps1:11 char:36
+ Export-ModuleMember -Function (Get-ScriptCmdlet -ScriptFolder $Cu …
+ ~~~~~~~~~~~~~~~~
PSScriptRoot : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts
PSCommandPath : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts\ExportCustomCommands.ps1
InvocationName : Get-ScriptCmdlet
CommandOrigin : Internal
ScriptStackTrace :
at <ScriptBlock>, C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts\ExportCustomCommands.ps1: line 11
at <ScriptBlock>, C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1: line 12
at <ScriptBlock>, C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1: line 11
at <ScriptBlock>, C:\home\site\wwwroot\profile.ps1: line 20
Microsoft.Azure.WebJobs.Script.Workers.Rpc.RpcException : Result: ERROR: The term 'Get-ScriptCmdlet' is not recognized as the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
Exception :
Type : System.Management.Automation.CommandNotFoundException
ErrorRecord :
Exception :
Type : System.Management.Automation.ParentContainsErrorRecordException
Message : The term 'Get-ScriptCmdlet' is not recognized as the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
HResult : -2146233087
TargetObject : Get-ScriptCmdlet
CategoryInfo : ObjectNotFound: (Get-ScriptCmdlet:String) [], ParentContainsErrorRecordException
FullyQualifiedErrorId : CommandNotFoundException
InvocationInfo :
ScriptLineNumber : 11
OffsetInLine : 36
HistoryId : 1
ScriptName : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts\ExportCustomCommands.ps1
Line : Export-ModuleMember -Function (Get-ScriptCmdlet -ScriptFolder $CustomScriptPath) -Alias (Get-ScriptCmdlet -ScriptFolder $CustomScriptPath -AsAlias)
PositionMessage : At C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts\ExportCustomCommands.ps1:11 char:36
+ Export-ModuleMember -Function (Get-ScriptCmdlet -ScriptFolder $Cu …
+ ~~~~~~~~~~~~~~~~
PSScriptRoot : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts
PSCommandPath : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts\ExportCustomCommands.ps1
InvocationName : Get-ScriptCmdlet
CommandOrigin : Internal
ScriptStackTrace :
at <ScriptBlock>, C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts\ExportCustomCommands.ps1: line 11
at <ScriptBlock>, C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1: line 12
at <ScriptBlock>, C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1: line 11
at <ScriptBlock>, C:\home\site\wwwroot\profile.ps1: line 20
CommandName : Get-ScriptCmdlet
TargetSite :
Name : LookupCommandInfo
DeclaringType : System.Management.Automation.CommandDiscovery, System.Management.Automation, Version=7.0.9.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35
MemberType : Method
Module : System.Management.Automation.dll
StackTrace :
at System.Management.Automation.CommandDiscovery.LookupCommandInfo(String commandName, CommandTypes commandTypes, SearchResolutionOptions searchResolutionOptions, CommandOrigin commandOrigin, ExecutionContext context)
at System.Management.Automation.CommandDiscovery.LookupCommandInfo(String commandName, CommandOrigin commandOrigin, ExecutionContext context)
at System.Management.Automation.CommandDiscovery.LookupCommandInfo(String commandName, CommandOrigin commandOrigin)
at System.Management.Automation.CommandDiscovery.LookupCommandProcessor(String commandName, CommandOrigin commandOrigin, Nullable`1 useLocalScope)
at System.Management.Automation.ExecutionContext.CreateCommand(String command, Boolean dotSource)
at System.Management.Automation.PipelineOps.AddCommand(PipelineProcessor pipe, CommandParameterInternal[] commandElements, CommandBaseAst commandBaseAst, CommandRedirection[] redirections, ExecutionContext context)
at System.Management.Automation.PipelineOps.InvokePipeline(Object input, Boolean ignoreInput, CommandParameterInternal[][] pipeElements, CommandBaseAst[] pipeElementAsts, CommandRedirection[][] commandRedirections, FunctionContext funcContext)
at System.Management.Automation.Interpreter.ActionCallInstruction`6.Run(InterpretedFrame frame)
at System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(InterpretedFrame frame)
at System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(InterpretedFrame frame)
Message : The term 'Get-ScriptCmdlet' is not recognized as the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
Data : System.Collections.ListDictionaryInternal
Source : System.Management.Automation
HResult : -2146233087
TargetObject : Get-ScriptCmdlet
CategoryInfo : ObjectNotFound: (Get-ScriptCmdlet:String) [], CommandNotFoundException
FullyQualifiedErrorId : CommandNotFoundException
InvocationInfo :
ScriptLineNumber : 11
OffsetInLine : 36
HistoryId : 1
ScriptName : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts\ExportCustomCommands.ps1
Line : Export-ModuleMember -Function (Get-ScriptCmdlet -ScriptFolder $CustomScriptPath) -Alias (Get-ScriptCmdlet -ScriptFolder $CustomScriptPath -AsAlias)
PositionMessage : At C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts\ExportCustomCommands.ps1:11 char:36
+ Export-ModuleMember -Function (Get-ScriptCmdlet -ScriptFolder $Cu …
+ ~~~~~~~~~~~~~~~~
PSScriptRoot : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts
PSCommandPath : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts\ExportCustomCommands.ps1
InvocationName : Get-ScriptCmdlet
CommandOrigin : Internal
ScriptStackTrace :
at <ScriptBlock>, C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\StartupScripts\ExportCustomCommands.ps1: line 11
at <ScriptBlock>, C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1: line 12
at <ScriptBlock>, C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1: line 11
at <ScriptBlock>, C:\home\site\wwwroot\profile.ps1: line 20
Exception: The term 'Get-ScriptCmdlet' is not recognized as the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
Stack:
at System.Management.Automation.CommandDiscovery.LookupCommandInfo(String commandName, CommandTypes commandTypes, SearchResolutionOptions searchResolutionOptions, CommandOrigin commandOrigin, ExecutionContext context)
at System.Management.Automation.CommandDiscovery.LookupCommandInfo(String commandName, CommandOrigin commandOrigin, ExecutionContext context)
at System.Management.Automation.CommandDiscovery.LookupCommandInfo(String commandName, CommandOrigin commandOrigin)
at System.Management.Automation.CommandDiscovery.LookupCommandProcessor(String commandName, CommandOrigin commandOrigin, Nullable`1 useLocalScope)
at System.Management.Automation.ExecutionContext.CreateCommand(String command, Boolean dotSource)
at System.Management.Automation.PipelineOps.AddCommand(PipelineProcessor pipe, CommandParameterInternal[] commandElements, CommandBaseAst commandBaseAst, CommandRedirection[] redirections, ExecutionContext context)
at System.Management.Automation.PipelineOps.InvokePipeline(Object input, Boolean ignoreInput, CommandParameterInternal[][] pipeElements, CommandBaseAst[] pipeElementAsts, CommandRedirection[][] commandRedirections, FunctionContext funcContext)
at System.Management.Automation.Interpreter.ActionCallInstruction`6.Run(InterpretedFrame frame)
at System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(InterpretedFrame frame)
at System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(InterpretedFrame frame)

2022-10-05T10:22:31.734 [Warning] The Function app may be missing a module containing the 'Get-ModuleCmdlet' command definition. If this command belongs to a module available on the PowerShell Gallery, add a reference to this module to requirements.psd1. Make sure this module is compatible with PowerShell 7. For more details, see https://aka.ms/functions-powershell-managed-dependency. If the module is installed but you are still getting this error, try to import the module explicitly by invoking Import-Module just before the command that produces the error: this will not fix the issue but will expose the root cause.

2022-10-05T10:22:32.075 [Error] ERROR: The term 'Get-ModuleCmdlet' is not recognized as the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
Exception :
Type : System.Management.Automation.CommandNotFoundException
ErrorRecord :
Exception :
Type : System.Management.Automation.ParentContainsErrorRecordException
Message : The term 'Get-ModuleCmdlet' is not recognized as the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
HResult : -2146233087
TargetObject : Get-ModuleCmdlet
CategoryInfo : ObjectNotFound: (Get-ModuleCmdlet:String) [], ParentContainsErrorRecordException
FullyQualifiedErrorId : CommandNotFoundException
InvocationInfo :
ScriptLineNumber : 17
OffsetInLine : 30
HistoryId : 1
ScriptName : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1
Line : Export-ModuleMember -Cmdlet (Get-ModuleCmdlet -ModulePath $ModulePath) -Alias (Get-ModuleCmdlet -ModulePath $ModulePath -AsAlias)
PositionMessage : At C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1:17 char:30
+ Export-ModuleMember -Cmdlet (Get-ModuleCmdlet -ModulePath $ModulePath …
+ ~~~~~~~~~~~~~~~~
PSScriptRoot : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0
PSCommandPath : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1
InvocationName : Get-ModuleCmdlet
CommandOrigin : Internal
ScriptStackTrace :
at <ScriptBlock>, C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1: line 17
at <ScriptBlock>, C:\home\site\wwwroot\profile.ps1: line 20
CommandName : Get-ModuleCmdlet
TargetSite :
Name : LookupCommandInfo
DeclaringType : System.Management.Automation.CommandDiscovery, System.Management.Automation, Version=7.0.9.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35
MemberType : Method
Module : System.Management.Automation.dll
StackTrace :
at System.Management.Automation.CommandDiscovery.LookupCommandInfo(String commandName, CommandTypes commandTypes, SearchResolutionOptions searchResolutionOptions, CommandOrigin commandOrigin, ExecutionContext context)
at System.Management.Automation.CommandDiscovery.LookupCommandInfo(String commandName, CommandOrigin commandOrigin, ExecutionContext context)
at System.Management.Automation.CommandDiscovery.LookupCommandInfo(String commandName, CommandOrigin commandOrigin)
at System.Management.Automation.CommandDiscovery.LookupCommandProcessor(String commandName, CommandOrigin commandOrigin, Nullable`1 useLocalScope)
at System.Management.Automation.ExecutionContext.CreateCommand(String command, Boolean dotSource)
at System.Management.Automation.PipelineOps.AddCommand(PipelineProcessor pipe, CommandParameterInternal[] commandElements, CommandBaseAst commandBaseAst, CommandRedirection[] redirections, ExecutionContext context)
at System.Management.Automation.PipelineOps.InvokePipeline(Object input, Boolean ignoreInput, CommandParameterInternal[][] pipeElements, CommandBaseAst[] pipeElementAsts, CommandRedirection[][] commandRedirections, FunctionContext funcContext)
at System.Management.Automation.Interpreter.ActionCallInstruction`6.Run(InterpretedFrame frame)
at System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(InterpretedFrame frame)
at System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(InterpretedFrame frame)
Message : The term 'Get-ModuleCmdlet' is not recognized as the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
Data : System.Collections.ListDictionaryInternal
Source : System.Management.Automation
HResult : -2146233087
TargetObject : Get-ModuleCmdlet
CategoryInfo : ObjectNotFound: (Get-ModuleCmdlet:String) [], CommandNotFoundException
FullyQualifiedErrorId : CommandNotFoundException
InvocationInfo :
ScriptLineNumber : 17
OffsetInLine : 30
HistoryId : 1
ScriptName : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1
Line : Export-ModuleMember -Cmdlet (Get-ModuleCmdlet -ModulePath $ModulePath) -Alias (Get-ModuleCmdlet -ModulePath $ModulePath -AsAlias)
PositionMessage : At C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1:17 char:30
+ Export-ModuleMember -Cmdlet (Get-ModuleCmdlet -ModulePath $ModulePath …
+ ~~~~~~~~~~~~~~~~
PSScriptRoot : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0
PSCommandPath : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1
InvocationName : Get-ModuleCmdlet
CommandOrigin : Internal
ScriptStackTrace :
at <ScriptBlock>, C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1: line 17
at <ScriptBlock>, C:\home\site\wwwroot\profile.ps1: line 20
Microsoft.Azure.WebJobs.Script.Workers.Rpc.RpcException : Result: ERROR: The term 'Get-ModuleCmdlet' is not recognized as the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
Exception :
Type : System.Management.Automation.CommandNotFoundException
ErrorRecord :
Exception :
Type : System.Management.Automation.ParentContainsErrorRecordException
Message : The term 'Get-ModuleCmdlet' is not recognized as the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
HResult : -2146233087
TargetObject : Get-ModuleCmdlet
CategoryInfo : ObjectNotFound: (Get-ModuleCmdlet:String) [], ParentContainsErrorRecordException
FullyQualifiedErrorId : CommandNotFoundException
InvocationInfo :
ScriptLineNumber : 17
OffsetInLine : 30
HistoryId : 1
ScriptName : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1
Line : Export-ModuleMember -Cmdlet (Get-ModuleCmdlet -ModulePath $ModulePath) -Alias (Get-ModuleCmdlet -ModulePath $ModulePath -AsAlias)
PositionMessage : At C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1:17 char:30
+ Export-ModuleMember -Cmdlet (Get-ModuleCmdlet -ModulePath $ModulePath …
+ ~~~~~~~~~~~~~~~~
PSScriptRoot : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0
PSCommandPath : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1
InvocationName : Get-ModuleCmdlet
CommandOrigin : Internal
ScriptStackTrace :
at <ScriptBlock>, C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1: line 17
at <ScriptBlock>, C:\home\site\wwwroot\profile.ps1: line 20
CommandName : Get-ModuleCmdlet
TargetSite :
Name : LookupCommandInfo
DeclaringType : System.Management.Automation.CommandDiscovery, System.Management.Automation, Version=7.0.9.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35
MemberType : Method
Module : System.Management.Automation.dll
StackTrace :
at System.Management.Automation.CommandDiscovery.LookupCommandInfo(String commandName, CommandTypes commandTypes, SearchResolutionOptions searchResolutionOptions, CommandOrigin commandOrigin, ExecutionContext context)
at System.Management.Automation.CommandDiscovery.LookupCommandInfo(String commandName, CommandOrigin commandOrigin, ExecutionContext context)
at System.Management.Automation.CommandDiscovery.LookupCommandInfo(String commandName, CommandOrigin commandOrigin)
at System.Management.Automation.CommandDiscovery.LookupCommandProcessor(String commandName, CommandOrigin commandOrigin, Nullable`1 useLocalScope)
at System.Management.Automation.ExecutionContext.CreateCommand(String command, Boolean dotSource)
at System.Management.Automation.PipelineOps.AddCommand(PipelineProcessor pipe, CommandParameterInternal[] commandElements, CommandBaseAst commandBaseAst, CommandRedirection[] redirections, ExecutionContext context)
at System.Management.Automation.PipelineOps.InvokePipeline(Object input, Boolean ignoreInput, CommandParameterInternal[][] pipeElements, CommandBaseAst[] pipeElementAsts, CommandRedirection[][] commandRedirections, FunctionContext funcContext)
at System.Management.Automation.Interpreter.ActionCallInstruction`6.Run(InterpretedFrame frame)
at System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(InterpretedFrame frame)
at System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(InterpretedFrame frame)
Message : The term 'Get-ModuleCmdlet' is not recognized as the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
Data : System.Collections.ListDictionaryInternal
Source : System.Management.Automation
HResult : -2146233087
TargetObject : Get-ModuleCmdlet
CategoryInfo : ObjectNotFound: (Get-ModuleCmdlet:String) [], CommandNotFoundException
FullyQualifiedErrorId : CommandNotFoundException
InvocationInfo :
ScriptLineNumber : 17
OffsetInLine : 30
HistoryId : 1
ScriptName : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1
Line : Export-ModuleMember -Cmdlet (Get-ModuleCmdlet -ModulePath $ModulePath) -Alias (Get-ModuleCmdlet -ModulePath $ModulePath -AsAlias)
PositionMessage : At C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1:17 char:30
+ Export-ModuleMember -Cmdlet (Get-ModuleCmdlet -ModulePath $ModulePath …
+ ~~~~~~~~~~~~~~~~
PSScriptRoot : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0
PSCommandPath : C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1
InvocationName : Get-ModuleCmdlet
CommandOrigin : Internal
ScriptStackTrace :
at <ScriptBlock>, C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Authentication\1.10.0\Microsoft.Graph.Authentication.psm1: line 17
at <ScriptBlock>, C:\home\site\wwwroot\profile.ps1: line 20
Exception: The term 'Get-ModuleCmdlet' is not recognized as the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
Stack:
at System.Management.Automation.CommandDiscovery.LookupCommandInfo(String commandName, CommandTypes commandTypes, SearchResolutionOptions searchResolutionOptions, CommandOrigin commandOrigin, ExecutionContext context)
at System.Management.Automation.CommandDiscovery.LookupCommandInfo(String commandName, CommandOrigin commandOrigin, ExecutionContext context)
at System.Management.Automation.CommandDiscovery.LookupCommandInfo(String commandName, CommandOrigin commandOrigin)
at System.Management.Automation.CommandDiscovery.LookupCommandProcessor(String commandName, CommandOrigin commandOrigin, Nullable`1 useLocalScope)
at System.Management.Automation.ExecutionContext.CreateCommand(String command, Boolean dotSource)
at System.Management.Automation.PipelineOps.AddCommand(PipelineProcessor pipe, CommandParameterInternal[] commandElements, CommandBaseAst commandBaseAst, CommandRedirection[] redirections, ExecutionContext context)
at System.Management.Automation.PipelineOps.InvokePipeline(Object input, Boolean ignoreInput, CommandParameterInternal[][] pipeElements, CommandBaseAst[] pipeElementAsts, CommandRedirection[][] commandRedirections, FunctionContext funcContext)
at System.Management.Automation.Interpreter.ActionCallInstruction`6.Run(InterpretedFrame frame)
at System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(InterpretedFrame frame)
at System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(InterpretedFrame frame)

2022-10-05T10:22:32.509 [Error] Errors reported while executing profile.ps1. See logs for detailed errors. Profile location: C:\home\site\wwwroot\profile.ps1.

2022-10-05T10:22:32.611 [Information] INFORMATION: PowerShell HTTP trigger function processed a request.

2022-10-05T10:22:34.589 [Error] ERROR: Could not load file or assembly 'Newtonsoft.Json, Version=13.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed'. The system cannot find the file specified.
Exception :
Type : System.IO.FileNotFoundException
Message : Could not load file or assembly 'Newtonsoft.Json, Version=13.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed'. The system cannot find the file specified.
FileName : Newtonsoft.Json, Version=13.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed
TargetSite :
Name : MoveNext
DeclaringType : Microsoft.Graph.PowerShell.Authentication.Cmdlets.ConnectMgGraph+<ProcessRecordAsync>d__56, Microsoft.Graph.Authentication, Version=1.12.3.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35
MemberType : Method
Module : Microsoft.Graph.Authentication.dll
StackTrace :
at Microsoft.Graph.PowerShell.Authentication.Cmdlets.ConnectMgGraph.ProcessRecordAsync()
Source : Microsoft.Graph.Authentication
HResult : -2147024894
CategoryInfo : NotSpecified: (:) [Connect-MgGraph], FileNotFoundException
FullyQualifiedErrorId : Microsoft.Graph.PowerShell.Authentication.Cmdlets.ConnectMgGraph
InvocationInfo :
MyCommand : Connect-MgGraph
ScriptLineNumber : 85
OffsetInLine : 1
HistoryId : 1
ScriptName : C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1
Line : Connect-MgGraph -ClientID $appId -TenantId $tenantID -CertificateThumbprint $thumb ## Or -CertificateName "M365-License"
PositionMessage : At C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1:85 char:1
+ Connect-MgGraph -ClientID $appId -TenantId $tenantID -CertificateThum …
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
PSScriptRoot : C:\home\site\wwwroot\HttpTrigger-Get-Licenses
PSCommandPath : C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1
InvocationName : Connect-MgGraph
CommandOrigin : Internal
ScriptStackTrace :
at <ScriptBlock>, C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1: line 85
PipelineIterationInfo :
Microsoft.Azure.WebJobs.Script.Workers.Rpc.RpcException : Result: ERROR: Could not load file or assembly 'Newtonsoft.Json, Version=13.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed'. The system cannot find the file specified.
Exception :
Type : System.IO.FileNotFoundException
Message : Could not load file or assembly 'Newtonsoft.Json, Version=13.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed'. The system cannot find the file specified.
FileName : Newtonsoft.Json, Version=13.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed
TargetSite :
Name : MoveNext
DeclaringType : Microsoft.Graph.PowerShell.Authentication.Cmdlets.ConnectMgGraph+<ProcessRecordAsync>d__56, Microsoft.Graph.Authentication, Version=1.12.3.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35
MemberType : Method
Module : Microsoft.Graph.Authentication.dll
StackTrace :
at Microsoft.Graph.PowerShell.Authentication.Cmdlets.ConnectMgGraph.ProcessRecordAsync()
Source : Microsoft.Graph.Authentication
HResult : -2147024894
CategoryInfo : NotSpecified: (:) [Connect-MgGraph], FileNotFoundException
FullyQualifiedErrorId : Microsoft.Graph.PowerShell.Authentication.Cmdlets.ConnectMgGraph
InvocationInfo :
MyCommand : Connect-MgGraph
ScriptLineNumber : 85
OffsetInLine : 1
HistoryId : 1
ScriptName : C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1
Line : Connect-MgGraph -ClientID $appId -TenantId $tenantID -CertificateThumbprint $thumb ## Or -CertificateName "M365-License"
PositionMessage : At C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1:85 char:1
+ Connect-MgGraph -ClientID $appId -TenantId $tenantID -CertificateThum …
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
PSScriptRoot : C:\home\site\wwwroot\HttpTrigger-Get-Licenses
PSCommandPath : C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1
InvocationName : Connect-MgGraph
CommandOrigin : Internal
ScriptStackTrace :
at <ScriptBlock>, C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1: line 85
PipelineIterationInfo :
Exception: Could not load file or assembly 'Newtonsoft.Json, Version=13.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed'. The system cannot find the file specified.
Stack:
at Microsoft.Graph.PowerShell.Authentication.Cmdlets.ConnectMgGraph.ProcessRecordAsync()

2022-10-05T10:22:35.196 [Error] ERROR: Authentication needed, call Connect-MgGraph.
Exception :
Type : System.Security.Authentication.AuthenticationException
TargetSite :
Name : GetGraphHttpClient
DeclaringType : Microsoft.Graph.PowerShell.Authentication.Helpers.HttpHelpers
MemberType : Method
Module : Microsoft.Graph.Authentication.dll
StackTrace :
at Microsoft.Graph.PowerShell.Authentication.Helpers.HttpHelpers.GetGraphHttpClient(InvocationInfo invocationInfo, IAuthContext authContext)
at Microsoft.Graph.PowerShell.Module.BeforeCreatePipeline(InvocationInfo invocationInfo, HttpPipeline& pipeline)
at Microsoft.Graph.PowerShell.Module.CreatePipeline(InvocationInfo invocationInfo, String parameterSetName)
at Microsoft.Graph.PowerShell.Cmdlets.GetMgSubscribedSku_List1.ProcessRecordAsync()
Message : Authentication needed, call Connect-MgGraph.
Source : Microsoft.Graph.Authentication
HResult : -2146233087
CategoryInfo : NotSpecified: (:) [Get-MgSubscribedSku_List1], AuthenticationException
FullyQualifiedErrorId : Microsoft.Graph.PowerShell.Cmdlets.GetMgSubscribedSku_List1
InvocationInfo :
MyCommand : Get-MgSubscribedSku_List1
ScriptLineNumber : 34
OffsetInLine : 1
HistoryId : 1
ScriptName : C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1
Line : $licenseUsage = Get-MgSubscribedSku | Select-Object -Property SkuPartNumber,CapabilityStatus,@{Name="PrepaidUnits";expression={$_.PrepaidUnits.Enabled -join ";"}},ConsumedUnits,SkuId,AppliesTo
PositionMessage : At C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1:34 char:1
+ $licenseUsage = Get-MgSubscribedSku | Select-Object -Property SkuPart …
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
PSScriptRoot : C:\home\site\wwwroot\HttpTrigger-Get-Licenses
PSCommandPath : C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1
InvocationName : Get-MgSubscribedSku
CommandOrigin : Internal
ScriptStackTrace :
at Get-MgSubscribedSku<Process>, C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Identity.DirectoryManagement\1.10.0\exports\v1.0\ProxyCmdletDefinitions.ps1: line 12245
at Get-LicenseUsage, C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1: line 34
at <ScriptBlock>, C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1: line 88
PipelineIterationInfo :
Microsoft.Azure.WebJobs.Script.Workers.Rpc.RpcException : Result: ERROR: Authentication needed, call Connect-MgGraph.
Exception :
Type : System.Security.Authentication.AuthenticationException
TargetSite :
Name : GetGraphHttpClient
DeclaringType : Microsoft.Graph.PowerShell.Authentication.Helpers.HttpHelpers
MemberType : Method
Module : Microsoft.Graph.Authentication.dll
StackTrace :
at Microsoft.Graph.PowerShell.Authentication.Helpers.HttpHelpers.GetGraphHttpClient(InvocationInfo invocationInfo, IAuthContext authContext)
at Microsoft.Graph.PowerShell.Module.BeforeCreatePipeline(InvocationInfo invocationInfo, HttpPipeline& pipeline)
at Microsoft.Graph.PowerShell.Module.CreatePipeline(InvocationInfo invocationInfo, String parameterSetName)
at Microsoft.Graph.PowerShell.Cmdlets.GetMgSubscribedSku_List1.ProcessRecordAsync()
Message : Authentication needed, call Connect-MgGraph.
Source : Microsoft.Graph.Authentication
HResult : -2146233087
CategoryInfo : NotSpecified: (:) [Get-MgSubscribedSku_List1], AuthenticationException
FullyQualifiedErrorId : Microsoft.Graph.PowerShell.Cmdlets.GetMgSubscribedSku_List1
InvocationInfo :
MyCommand : Get-MgSubscribedSku_List1
ScriptLineNumber : 34
OffsetInLine : 1
HistoryId : 1
ScriptName : C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1
Line : $licenseUsage = Get-MgSubscribedSku | Select-Object -Property SkuPartNumber,CapabilityStatus,@{Name="PrepaidUnits";expression={$_.PrepaidUnits.Enabled -join ";"}},ConsumedUnits,SkuId,AppliesTo
PositionMessage : At C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1:34 char:1
+ $licenseUsage = Get-MgSubscribedSku | Select-Object -Property SkuPart …
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
PSScriptRoot : C:\home\site\wwwroot\HttpTrigger-Get-Licenses
PSCommandPath : C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1
InvocationName : Get-MgSubscribedSku
CommandOrigin : Internal
ScriptStackTrace :
at Get-MgSubscribedSku<Process>, C:\home\data\ManagedDependencies\2210051007461207219.r\Microsoft.Graph.Identity.DirectoryManagement\1.10.0\exports\v1.0\ProxyCmdletDefinitions.ps1: line 12245
at Get-LicenseUsage, C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1: line 34
at <ScriptBlock>, C:\home\site\wwwroot\HttpTrigger-Get-Licenses\run.ps1: line 88
PipelineIterationInfo :
Exception: Authentication needed, call Connect-MgGraph.
Stack:
at Microsoft.Graph.PowerShell.Authentication.Helpers.HttpHelpers.GetGraphHttpClient(InvocationInfo invocationInfo, IAuthContext authContext)
at Microsoft.Graph.PowerShell.Module.BeforeCreatePipeline(InvocationInfo invocationInfo, HttpPipeline& pipeline)
at Microsoft.Graph.PowerShell.Module.CreatePipeline(InvocationInfo invocationInfo, String parameterSetName)
at Microsoft.Graph.PowerShell.Cmdlets.GetMgSubscribedSku_List1.ProcessRecordAsync()

2022-10-05T10:22:36.420 [Information] Executed 'Functions.HttpTrigger-Get-Licenses' (Succeeded, Id=67e43323-4ca9-418d-af47-56714cacc89a, Duration=23871ms)

I intentionally highlighted the error message:

ERROR: Assembly with same name is already loaded

… because a bit of troubleshooting led me to confirm that this is an issue caused by the Azure Function App downloading a newer version of the modules defined in the requirements.psd1 file:

clip_image002
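One way to prevent the Function App from pulling down a second, conflicting copy of a module is to pin an exact version in requirements.psd1 instead of using a wildcard. A minimal sketch — the module name and version below are illustrative, use whatever your app actually depends on:

```powershell
# requirements.psd1 – managed dependencies for the Function App
@{
    # A wildcard such as '1.*' lets the host silently download newer
    # releases next to the copy already loaded, which can trigger
    # "Assembly with same name is already loaded" errors.
    # Pinning an exact version avoids this:
    'Microsoft.Graph' = '1.10.0'
}
```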

To correct the issue, we can use Kudu to remove the duplicate module.

Every app that is created has a companion app named Kudu (https://docs.microsoft.com/en-us/azure/app-service/resources-kudu#access-kudu-for-your-app) that can be accessed via:

Go to: https://<app-name>.scm.azurewebsites.net

clip_image002[4]

You can browse the directories of the app by clicking on Debug console > PowerShell:

clip_image002[6]

Navigate to: data

clip_image002[8]

Then: ManagedDependencies

clip_image002[10]

You may find several folders in this directory:

clip_image002[12]

Proceed to browse into these folders and you’ll likely find two versions downloaded:

clip_image002[14]

Proceed to remove the module that is not needed, which will correct the issue.
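The same cleanup can also be scripted from the Kudu Debug console (PowerShell). The snapshot folder, module name, and version below are placeholders — adjust them to what you actually see under ManagedDependencies:

```powershell
# List the dependency snapshots the Functions host has downloaded
Get-ChildItem D:\home\data\ManagedDependencies

# Remove the module version that is no longer needed
# (folder name and version are examples only)
Remove-Item -Recurse -Force "D:\home\data\ManagedDependencies\<snapshot-folder>\Microsoft.Graph\<old-version>"
```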

Troubleshooting traffic blocked by Azure Front Door WAF Policy in Prevention mode

I recently had to troubleshoot an issue with an Azure Front Door WAF policy we had just changed from Detection to Prevention and thought I’d share some steps I used to troubleshoot and have this blog post for me to reference in the future as I’m bound to forget some of the steps.

Before you begin, have a look at the following Microsoft documentation to understand how to use Log Analytics to review WAF logs in a Log Analytics workspace:

Use Log Analytics to examine Application Gateway Web Application Firewall (WAF) Logs
https://learn.microsoft.com/en-us/azure/application-gateway/log-analytics

With that, let’s have a look at the process I went through.

Scenario

The developer notified me of an issue after the Front Door WAF Policy was switched from Detection to Prevention:

image

To troubleshoot, we used a browser’s Developer Tools to review the Network traffic and noticed the following error:

RequestMethod: Post
Status Code: 403
Referrer Policy: strict-origin-when-cross-origin

image

The above error was thrown when we opened the website and navigated to sit.domain.com, which presented a login page. We entered our credentials into the form and clicked login, which should then send us to sitapp.domain.com. Instead, clicking on the login button did not do anything because we would receive the 403 error shown above.

Below is a screenshot of the log showing the origin and referrer URL, which is the login site we tried to log into:

image

Troubleshooting 

The first thought I had was that the referrer policy did not allow sit.domain.com to send a successful login to sitapp.domain.com, so what we needed to do was whitelist the domain, but this actually wasn’t the case when we looked into the logs.

There are no custom rules configured, so we needed to determine which one of the managed rules blocked the traffic:

image

In order to troubleshoot, Diagnostics must be turned on to capture the WebApplicationFirewall logs:

image

Verify that you are sending the FrontDoorWebApplicationFirewall Log to a log analytics workspace as shown here:

image

Once logging is turned on, wait 10 to 15 minutes for the log capturing to begin then proceed to replicate the issue. Note that you should see AzureDiagnostics displayed when logging has come into effect:

image

We should see entries with the following query when the WAF logs are captured:

AzureDiagnostics
| where Category == "FrontDoorWebApplicationFirewallLog"

image

Use the following query to retrieve all the entries representing blocks by a WAF rule:

AzureDiagnostics
| where action_s == "Block"

**Note that Block is case sensitive so “block” will not return any results.

image
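The category and action filters can also be combined into one query that projects only the columns useful for this investigation; a sketch using the AzureDiagnostics column names shown above:

```kusto
AzureDiagnostics
| where Category == "FrontDoorWebApplicationFirewallLog" and action_s == "Block"
| project TimeGenerated, ruleName_s, requestUri_s, host_s, trackingReference_s
```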

Since we now have the information we collected from the browser network trace, we can use it to look for the entry representing the block we’re experiencing. The values I usually use are requestUri_s or host_s, but note that when using requestUri_s, the URL will need to have a colon and port number after the domain as shown in the screenshot below:

https://sitapp.domain.com:443/webapi/api/account/login

image

Here is an example of adding the requestUri_s in the query:

AzureDiagnostics
| where action_s == "Block" and requestUri_s == "https://sitapp.domain.com:443/webapi/api/account/login"

With the entry located, review the ruleName_s value and note the ending number:

image

In this example, the rule that is blocking the traffic is 949110. However, this may not be locatable in the Managed rules of the WAF policy:

image

Although this is the rule that results in the traffic being blocked, it could be triggered by another rule. To determine the other related rules, we can locate the trackingReference_s value to look up the other logs related to this:

image

Use the following query to retrieve the entries:

AzureDiagnostics
| where trackingReference_s contains "0viJPYwAAAAD2qxqM/l3+QJ3xvYLEO0hYQ0hJMzBFREdFMDUxMwA0MzRlN2MwNC02M2NmLTRjMjMtYjFhOS00NDNjYmViZTJmNTg="

image
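To pull the block reason out of the related entries in a single step, the tracking reference lookup can be extended to project the detail columns (the tracking reference value below is a placeholder):

```kusto
AzureDiagnostics
| where trackingReference_s contains "<tracking-reference>"
| project TimeGenerated, ruleName_s, action_s, details_msg_s, details_data_s
```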

Expand the results and you should see another rule with the number 200002 that contains the same trackingReference_s as shown in the screenshot below:

image

Scrolling further down in the log will display more details about the reason why the traffic is blocked and in this case they are:

details_msg_s: Failed to parse request body.

details_data_s: %{reqbody_error_msg}

image

Searching for the 200002 number in the Managed rules will display the following rule with the description Failed to parse request body:

image

Unsure of what to make of the details, a case was opened with Microsoft, but the engineer was unable to find any additional information for this error. He was, however, able to suggest that the following is what was happening:

When a request is made to sign into the https://sit.domain.com/ portal, the webpage directing the traffic to sitapp.domain.com sends some strange or malformed data in the request body, which causes the Front Door WAF policy to think there may be an attack. Their suggestion was to check the request body for any formatting error that would be triggering Front Door.

Workaround

I’ve asked the developer to look into this and, for the time being, changed the action for rule 200002 to Log rather than Block on anomaly so the portal login would start working:

imageimage

Not the best solution but serves as a workaround. Hope this helps anyone looking for information on how to troubleshoot Azure Front Door WAF policy prevention blocks.

Setting up BitTitan to migrate mailboxes from O365 to another O365 tenant fails source authentication with the error: "Your migration failed while checking source credentials. The request failed. The remote server returned an error: (401) Unauthorized."

It has been a while since I’ve been involved with Office 365 mail migrations, but I was recently contacted by a colleague who was setting up BitTitan’s MigrationWiz for a migration and could not figure out why he was not able to authenticate, so I hopped on to help. The search results from Google directed me to various documentation provided by BitTitan, but none of it led me to the right solution. I don’t usually use YouTube for troubleshooting as most of us probably feel it takes too long to watch a video compared to reading a blog post like this, but the following video is where I eventually found the answer:

How to solve BitTitan MigrationWiz Error 401 Unauthorized in 2022
https://www.youtube.com/watch?v=iI35AJrGYiw

This blog post serves to help anyone who might encounter the same problem quickly find the answer.

Problem

You attempt to use the Verify Credentials feature in MigrationWiz after setting up the source and destination tenants but receive the status: Failed (Verification)

image

Navigating into one of the accounts displays the following error message:

Your migration failed while checking source credentials. The request failed. The remote server returned an error: (401) Unauthorized.

image

Solution

The reason the environment I was troubleshooting has this failure is that Microsoft had started disabling basic authentication for Office 365, which affects the EWS service that MigrationWiz relies on (3-minute mark in the video). This can be confusing because if you navigate to Settings > Org Settings > Modern Authentication, you’ll see that it states basic authentication is enabled for various services (including Exchange Web Services). The problem here is that this only applies to the modern Outlook client, which MigrationWiz isn’t.

image

To remediate this, click on the Help & support button at the bottom right corner of the administration console:

image

Then type in the following string to search:

diag: enable basic auth in exo

imageimage

Proceed to click on the Run Tests button:

image

Assuming basic authentication is disabled, we should be provided with a drop-down menu to select a service to enable:

image

Select Exchange Web Services (EWS) to enable the MigrationWiz dependent service, then click on Update Settings:

image

The following message will be displayed:

Run diagnostics
Basic authentication has been re-enabled for the selected protocol.

The Basic authentication blocked applications setting has been updated. You should be able to use Basic authentication with the selected protocol within the next hour.

image

Proceed to try and verify the credentials in an hour or so and the process should complete successfully.

Hope this helps as it took me a bit of time to figure this out.

Configuring an Azure Function App that uses a system managed identity to execute Az.Compute module cmdlets that will retrieve all Azure VMs with their Status then use a Logic App run the app and email the report

In this post, I would like to demonstrate the following using an Azure Function App and Logic App.

Function App:

  1. Use the Az.Compute module to execute Get-AzVM to get the list of virtual machines and store it in an array
  2. Loop through the virtual machines and retrieve the name, resource group, location, VM size, and OS type
  3. Retrieve the VM status
  4. Store all fields in an array
  5. Create an HTML header and body
  6. Convert the data into HTML format
  7. Return an HTML formatted report for delivery
  8. The Function App will use a System Assigned Managed Identity for authentication and authorization

Logic App:

  1. Set up a recurring Logic App that runs every day
  2. Executes the Function App to retrieve the HTML formatted email report
  3. Send an email with the HTML formatted email report

Step #1 – Create a Function App that will retrieve the list of Virtual Machines, generate and return an HTML email report

Begin by creating a Function App that will retrieve the list of Azure virtual machines and return it in HTML format. This Function App collects the data that will in turn be called by a Logic App to generate an email and send the report off to an email address.

image

Proceed to create a Function App with the following parameters:

Publish: Code

Runtime stack: PowerShell Core

Version: 7.2

Operating System: Windows

Configure the rest of the parameters as required by the environment.

image

image

With the Function App created, proceed to create the function trigger:

image

Select HTTP trigger as the template and provide a meaningful name:

image

With the trigger created, navigate to Code + Test and paste the following code into run.ps1:

https://github.com/terenceluk/Azure/blob/main/Function%20App/Get-AzureVMs.ps1

image

The following are changes you’ll need to apply to the code:

The client name:

image
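For reference, a condensed sketch of what such a run.ps1 can look like — this is not the exact script from the repository linked above, and the client name and HTML styling are simplified placeholders:

```powershell
using namespace System.Net
param($Request, $TriggerMetadata)

$clientName = "Contoso" # example client name – replace as required

# Get all VMs with their instance view so PowerState is populated
$vms = Get-AzVM -Status

# Build one record per VM with the fields for the report
$report = foreach ($vm in $vms) {
    [PSCustomObject]@{
        Name          = $vm.Name
        ResourceGroup = $vm.ResourceGroupName
        Location      = $vm.Location
        VmSize        = $vm.HardwareProfile.VmSize
        OsType        = $vm.StorageProfile.OsDisk.OsType
        Status        = $vm.PowerState
    }
}

# Convert the records to an HTML table with a simple heading
$html = $report | ConvertTo-Html -Title "Azure VM Report" `
    -PreContent "<h2>$clientName - Azure VM Report</h2>" | Out-String

# Return the HTML so the Logic App can drop it into the email body
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body       = $html
})
```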

Save the Function App and navigate back out to the Function App > App files, switch to the requirements.psd1, then add the following line to load the Az.Compute module, which will allow Get-AzVM to be executed:

'Az.Compute' = '5.*'

image
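In context, requirements.psd1 would then look something like this — the version numbers are examples, and the commented Az entry is the default that ships with new PowerShell Function Apps and loads the entire, much larger, Az bundle:

```powershell
# requirements.psd1
@{
    # 'Az' = '8.*'          # default entry – loads the full Az bundle
    'Az.Compute' = '5.*'    # only the module that Get-AzVM needs
}
```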

Save the file and navigate to the Identity blade then turn on the System assigned identity:

image

image

Once the system assigned managed identity is created, you should see the Function App created in the Enterprise applications:

image

Click on Azure role assignments while still in the Identity blade of the Function App:

image

Configure Reader permissions on the subscription containing the VMs:

image

With the Reader role granted, navigate back to the Function App and execute the Test/Run feature with HTTP method POST and without any body submitted:

image

You should see an HTTP response code 200 OK with the contents of your report displayed:

image

Step #2 – Create a Logic App that is scheduled to run every day to call the Azure Function App to retrieve the VM report and then send an email report out

With the Azure Function App created and tested, proceed to create the Logic App that will be scheduled to run every day, call the Azure Function App to retrieve the VM report, and then send an email report out.

image

Navigate to the Logic app designer blade and begin to configure the steps for the Logic App. The following are the steps we’ll be configuring:

The first is the Recurrence step that will schedule this logic app to run at 9:00 a.m. EST every day:

image

Create an additional step by clicking on the + button, select Add an action, then type in Function and select the Function that was created:

image

We won’t need to pass a parameter so leave it unconfigured:

image

Proceed to create two additional steps:

  1. Initialize variable
  2. Set variable

These two steps will place the retrieved HTML report into the body of the email:

Initialize variable

Name: EmailBody
Type: String
Value: <leave blank>

image

Set variable

Name: EmailBody
Value: Select the Body

image

Configure the last step as Send an email (V2) that will email this report to the email address required:

image

Save the logic app and proceed to use the Run Trigger feature to execute the Logic App and confirm that the report is generated and sent:

image

One of the steps I did not include in this post is securing the Function App to require authentication while still allowing the Logic App to execute it. Please see one of my previous posts for the steps:

Securing Azure Function App to require authentication and granting access to a Logic Apps’ Managed Identity
http://terenceluk.blogspot.com/2022/09/securing-azure-function-app-to-require.html

I hope this helps anyone who may be looking for instructions on how to configure automated reports with virtual machine details.

Installing and importing SharePoint Online Management Shell to remove a Deleted SharePoint site

I typically do not work within the SharePoint Online space but am periodically asked to assist with administrative tasks, and one of the issues I constantly face is not finding the right cmdlets for the task at hand, but rather installing and importing the SharePoint Online Management Shell itself. My colleagues seem to encounter this as well, so I thought I’d write a short blog post to outline the steps to get the module installed and demonstrate how to delete a deleted SharePoint site as an example.

Problem - Installing and Importing the SharePoint Online Management Shell

The official Microsoft document demonstrates the process of installing the module here:

Get started with SharePoint Online Management Shell
https://learn.microsoft.com/en-us/powershell/sharepoint/sharepoint-online/connect-sharepoint-online

The article provides the following cmdlets:

Get-Module -Name Microsoft.Online.SharePoint.PowerShell -ListAvailable | Select Name,Version

Install-Module -Name Microsoft.Online.SharePoint.PowerShell

What this article does not provide is the cmdlet to import the module after installing it, as it jumps straight to the following cmdlet to connect to SharePoint Online:

Connect-SPOService -Url https://contoso-admin.sharepoint.com -Credential admin@contoso.com

Attempting to immediately connect will throw the following error:

Connect-SPOService: The term 'Connect-SPOService' is not recognized as a name of a cmdlet, function, script file, or executable program.

Check the spelling of the name, or if a path was included, verify that the path is correct and try again.

image

Attempting to import the module with the following cmdlet will yield no output in the PowerShell console:

Import-Module Microsoft.Online.Sharepoint.Powershell -DisableNameChecking

image

Then trying to use the Connect-SPOService again will display the following error:

Connect-SPOService: The remote server returned an error: (400) Bad Request.

image

Solution

Before attempting to connect to SharePoint Online, the following cmdlet needs to be executed to import the module:

Import-Module Microsoft.Online.Sharepoint.Powershell -UseWindowsPowerShell

image

Once authenticated and connected, you should now be able to execute cmdlets such as the following to remove a deleted SharePoint site that has its associated Microsoft 365 group removed:

Remove-SPODeletedSite -Identity https://contoso.sharepoint.com/sites/ContosoLtd

image

Note that if the site’s associated Microsoft 365 group has not been removed first, the cmdlet will instead display the error: The site can’t be permanently deleted because it’s connected to a Microsoft 365 group

image
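Putting it all together, the working sequence from a fresh PowerShell 7 session looks like this (the tenant URLs are the placeholder values from the Microsoft documentation):

```powershell
# Install the module from the PowerShell Gallery (once per machine)
Install-Module -Name Microsoft.Online.SharePoint.PowerShell

# Import it – in PowerShell 7 the -UseWindowsPowerShell switch loads it
# through a Windows PowerShell compatibility session
Import-Module Microsoft.Online.SharePoint.PowerShell -UseWindowsPowerShell

# Connect to the tenant admin site (prompts for credentials)
Connect-SPOService -Url https://contoso-admin.sharepoint.com

# Permanently remove a site that is already in the recycle bin
Remove-SPODeletedSite -Identity https://contoso.sharepoint.com/sites/ContosoLtd
```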

Hope this helps anyone who might be looking for a quick answer to install, import, and connect to SharePoint Online via PowerShell.
