If there is one thing I love, it is automation. If you are a systems administrator, you have probably at some point needed to create reports using PowerShell. Usually you would do this manually, export the results to a CSV, then hand off the report in an email or through SharePoint, etc. But how can we automate this process?

You could run the PowerShell script as a scheduled task that fires off an email, which is quick and dirty. But is there a more elegant solution? Enter Power Automate flows.

First, I am going to create an Azure Storage table to hold our report data. Why? While this is not strictly necessary, I like the idea of retaining the data for future use, or wowing the client/boss with a Power BI report. Log into Azure and create a v2 storage account. Go to “Tables” and create a table. Name it whatever you like.
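
If you prefer to script this part instead of clicking through the portal, the Az PowerShell module can do the same thing. A minimal sketch, assuming you are already signed in with Connect-AzAccount and using hypothetical names (Reports-RG, myreportstorage, LicenseReport):

# Resource group, general-purpose v2 storage account, and the table that will hold the report
New-AzResourceGroup -Name 'Reports-RG' -Location 'westus'

$account = New-AzStorageAccount -ResourceGroupName 'Reports-RG' -Name 'myreportstorage' `
    -Location 'westus' -SkuName 'Standard_LRS' -Kind 'StorageV2'

New-AzStorageTable -Name 'LicenseReport' -Context $account.Context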

Here I have created a couple of tables to hold different reports.

Now let's log into Power Automate and create our flow.

Select “New flow”, then “Automated cloud flow”. Click “Skip” without choosing a trigger; we will add it manually.

Now let's create our trigger. We will use “When a HTTP request is received”.

Now we need to generate a JSON schema. To do this, run your PowerShell report. Here is the report I will be using:

# Connect to Azure AD and grab the tenant ID (needed to look up SKU part numbers)
Connect-AzureAD
$TenantID = Get-AzureADTenantDetail | Select-Object -ExpandProperty ObjectId

# Pull every user that has at least one license assigned
$users = Get-AzureADUser -All $true | Where-Object AssignedLicenses -ne $null

$list = [System.Collections.Generic.List[PsObject]]::new()
foreach ($user in $users)
{
	# Translate each assigned SKU ID into its friendly part number (e.g. ENTERPRISEPACK)
	$licenses = $user.AssignedLicenses.SkuId | ForEach-Object { (Get-AzureADSubscribedSku -ObjectId ($TenantID + '_' + $_)).SkuPartNumber }
	$list.Add([PSCustomObject]@{
			Name          = $user.DisplayName
			Email         = $user.UserPrincipalName
			licenses      = $licenses -join ','
			StreetAddress = $user.StreetAddress
		})
}

This report will generate a list of licensed users with their license details and location. To generate our sample JSON payload, just convert the output to JSON:

$list | ConvertTo-Json
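
One gotcha worth noting: ConvertTo-Json emits a single JSON object rather than an array when the collection only holds one item, which would not match the array schema below. If that can happen in your environment, you can force array output. A small sketch (the -AsArray switch needs PowerShell 7; the -InputObject form also works on Windows PowerShell 5.1):

# Wrap the list so ConvertTo-Json always produces a JSON array,
# even when only one user made it into $list
$json = ConvertTo-Json -InputObject @($list) -Compress

# PowerShell 7+ alternative:
# $json = $list | ConvertTo-Json -AsArray -Compress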

Grab the output and paste it in as the sample JSON payload. Here's what the generated schema looks like:

{
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "Name": {
                "type": "string"
            },
            "Email": {
                "type": "string"
            },
            "licenses": {
                "type": "string"
            },
            "StreetAddress": {
                "type": "string"
            }
        },
        "required": [
            "Name",
            "Email",
            "licenses",
            "StreetAddress"
        ]
    }
}

Now click “New step”. Search for “Azure Table Storage” and pick the “Insert Entity” action. Next, set up the connection.

Set the connection name to something descriptive. The storage account name is the exact name as it's displayed on your account. The storage key is found under Settings > Access keys.

Next you'll be asked to enter a Partition Key, Row Key, and Entity. For the partition key we will use partition1 (this can be anything you want). For the row key we will use “Name”; keep in mind that row keys must be unique within a partition. For the entity, select “Current item”.

At this point our flow is good enough to get data into our table. Go ahead and click Save.

Once saved, click on the first step, “When a HTTP request is received”, and copy the HTTP POST URL.

Now, over in our script, let's add what we need to POST the data. We need a header specifying the content type and a call to Invoke-RestMethod:

# The Content-Type header tells the flow we are sending JSON
$header = @{ 'Content-Type' = 'application/json' }

# Paste the HTTP POST URL copied from the flow trigger here
$flow = "https://prod-133.westus.logic.azure.com_ect"

# Convert the report to compressed JSON and POST it to the flow
Invoke-RestMethod -Method Post -Body ($list | ConvertTo-Json -Compress) -Uri $flow -Headers $header

Our header specifies that we are sending “application/json” data; if we do not include it, the POST will fail. The -Uri parameter of Invoke-RestMethod is the URL we copied from the flow's trigger step.
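
If you want the script to surface failures (an expired flow URL, a malformed body, and so on), you can wrap the call in try/catch. A minimal sketch using the same variables as above:

try {
    # -ErrorAction Stop turns a failed request into a terminating error we can catch
    Invoke-RestMethod -Method Post -Body ($list | ConvertTo-Json -Compress) -Uri $flow -Headers $header -ErrorAction Stop
    Write-Output 'Report posted to the flow successfully.'
}
catch {
    Write-Error "POST to the flow failed: $_"
}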

OK, to test, just run the entire script. You can check “My flows” to see whether the run succeeded.

Now go over to your Azure storage account. Use the Storage explorer to confirm the table was filled.
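
If you would rather verify from PowerShell than the portal, the AzTable module can read the rows back. A quick sketch, assuming the Az and AzTable modules are installed, you are signed in with Connect-AzAccount, and the hypothetical names Reports-RG / myreportstorage / LicenseReport from earlier:

# Get a context for the storage account and grab the table
$ctx = (Get-AzStorageAccount -ResourceGroupName 'Reports-RG' -Name 'myreportstorage').Context
$table = Get-AzStorageTable -Name 'LicenseReport' -Context $ctx

# Each row comes back with PartitionKey, RowKey, Timestamp plus the
# properties we posted (Name, Email, licenses, StreetAddress)
Get-AzTableRow -Table $table.CloudTable | Format-Table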

Now we see it was a success! Our table is filled with the output of our PowerShell script. We can create more steps in our flow to take this data and post it to Teams, send an email, upload it to SharePoint, etc.

I will quickly demo how to send this in an email as an HTML table. Let's go back to our flow and add some more steps. Select “New step” and add “Get entities”.

If you want to filter based on property criteria, you can use an OData filter query. An example would be StreetAddress eq '1234 Fake Street'

This query would only output users with the street address parameter of ‘1234 Fake Street’.

Select “New step” and use “Initialize variable”. Set the name to ‘Table’ and the type to ‘Array’. For the value, pick ‘Get entities result list of entities’ from the dynamic content.

Next, we will create an HTML table. For the “From” field, use the dynamic content of our array variable “Table”.

The next step is optional. If you only want certain properties from the array to end up in the HTML table, create a “Data Operations - Select” step.

For example:

From: Table
Map: Name | item()?['Name']
Map: Email | item()?['Email']

Enter the mapping values into the expression field as shown:

Last, we will generate an email with the HTML table we created.

Create a new step “Send an email”. In the body, select the dynamic content “Output”. Alternatively, you could put the “Output” in the attachments.

And that’s it! You should have a good foundation here to create some nifty flows. Combine this with Azure Automation to run the reports on a schedule.
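
If you take the Azure Automation route, the schedule itself can also be scripted. A rough sketch, assuming the report has already been imported as a runbook named LicenseReportRunbook (hypothetical) in an Automation account named ReportsAutomation (also hypothetical):

# Create a daily schedule and attach the runbook to it
New-AzAutomationSchedule -ResourceGroupName 'Reports-RG' -AutomationAccountName 'ReportsAutomation' `
    -Name 'DailyLicenseReport' -StartTime (Get-Date).AddHours(1) -DayInterval 1

Register-AzAutomationScheduledRunbook -ResourceGroupName 'Reports-RG' -AutomationAccountName 'ReportsAutomation' `
    -RunbookName 'LicenseReportRunbook' -ScheduleName 'DailyLicenseReport'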

Credit to the Intune Training YouTube channel for their amazing example, Using Power Automate Flows to Collect Custom Client Inventory - Part 1.