Feature planning: first class PowerShell support #337
Need to make sure that the new model solves the log streaming issue described in this thread.
@christopheranderson first sentence talks about both PowerShell & Python. Copy/paste error I assume? :)
Yes. That's what happened. Nice catch.
I noticed the current version runs on PowerShell v4. I'd really like to see that get bumped to v5.
Need to have a means of reporting a failed execution. Related issue: #371
Need docs around external .dlls/modules/etc. for PowerShell. Related issue: #372
@tohling Does your new PowerShell support address the issues @trondhindenes is raising?
Per @christopheranderson's point, I would love more docs. Currently experiencing problems with 1) executing an exe and 2) using a module.
I'm a bit confused. This is clearly not first class PowerShell support -- it appears that you've created a way to support "any executable" by routing IO through the file system. That is, it looks like you're trying to create a single method for providing minimal support for PowerShell, Python, Perl, Ruby, et al. Compared to, say, the C# implementation, there's nothing "first class" about this -- you might as well just have "command-line app" support and let us provide files and command lines. Why can't you use the PowerShell API, host it, and pass a parameter with the request object, collecting the output the same as you would from C# (example in the closed #309 above)?
@Jaykul, your observation is correct. This feature is intended to support customers who want to write Azure Functions using PowerShell cmdlets. As such, the user experience will try to mimic as closely as possible the command-line and scripting workflows of the PowerShell.exe console. When you mentioned your preference for the PowerShell API, are you referring to using System.Management.Automation.dll and writing C# code that calls PowerShell APIs, as shown in the example here? If that is the case, you can author your code in a C# Function. Here is a simplified example of a C# HTTP-triggered Function that uses the PowerShell API. Sample code:
Sample log output:
Sample HTTP response message:
I was thinking more about Azure doing the C# part, and letting people write PowerShell functions that would, to quote you:
As a general rule, when people are writing functions for the PowerShell console, it would be an anti-pattern to use file IO for parameters or output. Instead, functions should take parameters and output objects. In my ideal world, your sample PowerShell template function would look more like this:
param($request)
Write-Verbose "PowerShell HTTP trigger function processed a request. RequestUri=$($request.RequestUri)"
<# ... do stuff #>
$name = $request.GetQueryNameValuePairs().Where({$_.Key -eq "Name"}).Value
return $request.CreateResponse("OK", "Hello $name")
But obviously that requires some code a little like what you wrote in your C# example -- I don't think people should have to write that themselves, nor settle for having to serialize and deserialize through JSON on disk ...
@Jaykul, thank you for the clarification. I understand now that your expectation was that the user experience would be more akin to a PowerShell function. The term "PowerShell function" clearly carries more than one meaning at this point. We will keep this in mind and update our documentation to make the distinction clearer. Unfortunately, the ideal workflow you suggested is not a supported scenario at this 'Experimental' stage. We appreciate this feedback and will add it as a consideration for our future planning.
@aharpervc, thank you for your input. We have added this request to our list and hope to support it in a future release.
Just throwing in my 2c. I write PowerShell to work with SharePoint and Office 365. These are scripts originally intended to be run from a client PC. I then upload them (with assemblies & modules) to Azure Functions and they run just as they would locally, but now in the cloud. So, I think mirroring this use case for the experimental stage is perfect. It works exactly as I expect. Whether or not I get pure PowerShell Functions with proper object streaming is almost secondary. As long as the documentation makes it clear what's being passed in/out, I think it's fine. Would it be possible in v2 to support object streaming via a different binding syntax? A different point: how would one call one PowerShell Azure Function from another?
@johnnliu, thank you for trying out PowerShell in Functions. Q1: Would it be possible in v2 to support object streaming via a different binding syntax? Q2: A different point: how would one call one PowerShell Azure Function from another?
Q2: calling an Azure Function from another Azure Function is a conversation for another thread, I think. Q1: object streaming as parameter or output is the same issue that was raised earlier by @Jaykul -- PowerShell, in contrast to Bash, lets one script pipe objects natively to the next script. So in PowerShell, $req should be the request object, not the path to a temporary file that holds the contents of the request. My thinking is that this isn't top priority for me personally in experimental, but wondering aloud if this can be added later. The trouble is, similar to when $env:req was renamed to $req, lots of PowerShell scripts will break if this changes.
@johnnliu, thank you for the clarification. Yes, we will be adding that to our list of items to support in future updates.
Closing this as all the work is now tracked at https://github.com/azure/azure-functions-powershell-worker
This is a tracking item for First Class PowerShell support. We'll be tracking the requirements and issues for first class PowerShell support via this item. Feel free to leave comments about features/design patterns.
Function format
For the Azure Function PowerShell format, we'll keep the existing "scripting" model and limit the run file to the .ps1 format. A future feature will address .psm1 support with a proper "function" model.
This means we use the existing pattern of communicating data via files & environment variables. It also means that we don't "cache" the environment that we run the script from. This will mean an inherent performance overhead, but this is likely acceptable for the scenarios where PowerShell scripting will be used. For more advanced scenarios, we'll need to address those issues with psm1 support.
The scripting format currently looks like this:
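As an illustrative sketch of this model (the binding names req and res are placeholders I'm assuming here, not names from the original sample), a script reads its input from the file whose path is in the matching environment variable and writes its output the same way:

```powershell
# Input: $Env:req holds the path to a temp file containing the trigger
# payload for the binding named "req"; the script parses it itself.
$requestBody = Get-Content $Env:req | ConvertFrom-Json
$name = $requestBody.name

# Output: $Env:res holds the path the host reads back for the output
# binding named "res" after the script finishes.
"Hello $name" | Out-File -Encoding UTF8 $Env:res
```

This is the file-and-environment-variable pattern the rest of this section describes; no objects cross the process boundary, only file contents.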
Breaking this down, data coming in (via triggers and input bindings) is passed along via files, which are communicated via environment variables; the names of those variables derive from the name property of the corresponding binding in the function.json. Data out works the same way: any data being sent to an output binding is written to a local file specified via the environment variable corresponding to the name property of that output binding in the function.json.
Data Type formats
All data is transferred via files. This means that it's up to the user to parse the file to the right format.
Assuming the user knows which format the data in the file is in, all formats should be supportable.
[string]$str = Get-Content $Env:input
$str > $Env:output
[int]$int = Get-Content $Env:input
$int > $Env:output
[bool]$bool = [int](Get-Content $Env:input)
$bool > $Env:output
$json = Get-Content $Env:input | ConvertFrom-Json
$json | ConvertTo-Json > $Env:output
[byte[]]$byte = Get-Content $Env:input -Encoding Byte
$byte | Set-Content $Env:output -Encoding Byte
$reader = [System.IO.File]::OpenText($Env:input)
$writer = [System.IO.StreamWriter] $Env:output
# ... stream between $reader and $writer, then close both
$reader.Close()
$writer.Close()
Version & Package management
TBD
Testing/CI
TBD
Change log