A recent update to Android OS enables running a Debian Linux distribution on an Android phone. With a workable Linux distribution available on mobile, we now have the ability to install and run PowerShell in this Linux environment.
Google Pixel: The Linux terminal feature is still very new, so Google Pixels are currently the only phones receiving the update. As with many other Android features, non-Pixel phones will likely receive it in the coming months.
March 2025 software update: The Linux terminal was included in the March 2025 Pixel update, so make sure to install the latest software update.
The Linux terminal can be enabled from Developer options. If you don't have Developer options enabled, follow these procedures to enable it.
Once enabled, navigate to Settings > System > Developer options. If you've received the latest update, you'll see the Linux development environment option under the Debugging section. Under this option, enable (Experimental) Run Linux terminal on Android.
Once the Linux development environment is enabled, open your app drawer and open the Terminal app--you'll see "Install Linux terminal". In the corner of the screen, click Install. The install will take a few minutes.
The Google Pixel runs on an ARM64-based processor, so we'll follow these procedures for installing PowerShell as a binary archive, as opposed to from a package manager like APT. Simply copy the code from the procedures and paste into the terminal.
Important
The example from the above procedures specifically references the x64 edition of the PowerShell binary (e.g. powershell-7.5.0-linux-x64.tar.gz). The correct binary for arm64 processors is powershell-7.5.0-linux-arm64.tar.gz. The code below follows the same procedures provided by Microsoft, but for arm64 instead of x64, which is compatible with the Google Pixel:
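A sketch of those steps adjusted for arm64 (the version number is pinned as an example; check the PowerShell releases page for the latest):

```shell
# Download the arm64 PowerShell binary archive
curl -LO https://github.com/PowerShell/PowerShell/releases/download/v7.5.0/powershell-7.5.0-linux-arm64.tar.gz

# Create the target folder where PowerShell will be placed
sudo mkdir -p /opt/microsoft/powershell/7

# Expand the archive into the target folder
sudo tar zxf ./powershell-7.5.0-linux-arm64.tar.gz -C /opt/microsoft/powershell/7

# Set execute permissions and create a symbolic link so pwsh is on the PATH
sudo chmod +x /opt/microsoft/powershell/7/pwsh
sudo ln -s /opt/microsoft/powershell/7/pwsh /usr/bin/pwsh
```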
Bash is the default shell for this terminal, but you can change your default shell by running chsh -s <shell binary> <username>. The below example will set PowerShell as the default shell for our user (droid):
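A sketch, assuming pwsh is installed at /usr/bin/pwsh:

```shell
# Set PowerShell as the default login shell for the user "droid"
chsh -s /usr/bin/pwsh droid
```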
The Linux feature is still in the experimental stage, and as such it can be pretty glitchy. Below are some tips to resolve any issues you may run into.
Enable notifications
The terminal displays a persistent notification while it's running. As a result, when notifications are disabled, the app tends to act up. Ensuring notifications are enabled avoids some of these issues.
Pause the app
If the app is acting erratically or not responding, I've found that pausing the app can resolve some issues. Press and hold on the app icon and select Pause app. Then open the app again and when prompted, select Unpause app.
Recovery
If the app is still not acting properly or keeps crashing, you can reset the app's data by clicking the settings "gear" icon in the upper right corner, navigating to Recovery > Reset to initial version, and clicking Reset.
Warning
This will delete all data related to the Linux environment on the phone.
Re-enable the Linux environment
There are times when the app is acting up so much that Recovery isn't even an option. In this case, simply disabling, then re-enabling the Linux environment via Developer options (as described above) will reset the app.
Warning
As with the Recovery option, this will also delete all data related to the Linux environment on the phone.
With the recent explosion of AI and large language models (LLMs), I've been brainstorming how to take advantage of AI capabilities within a CI/CD pipeline.
Most of the major AI providers have a REST API, so I could of course easily use that in a CI pipeline, but there are many situations where this isn't an option:
Cost: As many "AI wrapper" companies quickly discovered, these APIs are expensive. And running queries in a CI pipeline that could run potentially hundreds of times per day adds up quickly.
Security: Many organizations handling sensitive or proprietary data don't want their information sent to a third party like OpenAI or Google.
To solve these issues, I wanted to see if it's possible to run an LLM locally in a CI job, to which I can send queries without worrying about API cost or revealing sensitive data.
To start, you'll need either a GitHub or GitLab account, and you'll need to create your first repository. Once that's done, create a basic CI/CD pipeline named ci:
GitHub:

```yaml
name: ci
on:
  push:
```

GitLab:

```yaml
workflow:
  name: ci
```
This creates a basic structure for a pipeline that runs on all commits. To limit the pipeline to only run on a certain branch, modify GitHub's on.push option or GitLab's workflow:rules. For example:
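A sketch that limits the pipeline to the main branch (the branch name is an assumption; substitute your own):

GitHub:

```yaml
on:
  push:
    branches:
      - main
```

GitLab:

```yaml
workflow:
  rules:
    - if: $CI_COMMIT_BRANCH == "main"
```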
The ollama CLI is great for running a local, interactive chat session in your terminal. But for a non-interactive, automated CI job, it's best to interface with the Ollama API. To do this, we first need to define our ollama job and run Ollama as a service accessible by our job.
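A sketch of the job with Ollama attached as a service via the ollama/ollama Docker image (the job skeleton is an assumption; adapt it to your pipeline):

GitHub:

```yaml
jobs:
  ollama:
    runs-on: ubuntu-latest
    services:
      ollama:
        image: ollama/ollama
```

GitLab:

```yaml
ollama:
  services:
    - name: ollama/ollama
      alias: ollama
```

In both cases the service is reachable from the job at the hostname ollama, which is what the curl commands below rely on.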
Next we'll add our script. When we request a response from the LLM we'll need to specify a large language model to generate that response. These models can be found in Ollama's library. Any model will work, but keep in mind that models with more parameters--while providing much better responses--are much larger in size. The 671 billion parameter version of deepseek-r1, for example, is 404GB in size. As such, it's ideal to use smaller models such as Meta's llama3.2.
Prior to generating a response, we'll first need to pull the model we want using Ollama's pull API. Then we generate the response with the generate API. Any Docker image will work for this job as long as it has the ability to send web requests with tools like wget or curl. For this example we'll be using curl with the alpine/curl image.
GitHub:

```yaml
container: alpine/curl
steps:
  - name: Generate response
    run: |
      curl -sS -X POST -d '{"model":"llama3.2","stream":false}' ollama:11434/api/pull
      curl -sS -X POST -d '{"model":"llama3.2","stream":false,"prompt":"Hello world"}' ollama:11434/api/generate
```

GitLab:

```yaml
image: alpine/curl
script: |
  curl -sS -X POST -d '{"model":"llama3.2","stream":false}' ollama:11434/api/pull
  curl -sS -X POST -d '{"model":"llama3.2","stream":false,"prompt":"Hello world"}' ollama:11434/api/generate
```
Note
Ideally, the pull and generate operations would run in separate steps. GitHub uses the steps functionality for this; however, the comparable functionality in GitLab (run) is still in the experimental stage. For simplicity's sake in this article, we'll run the commands in a single script in both GitHub and GitLab.
To accomplish the same in separate steps would look like this:
GitHub:

```yaml
container: alpine/curl
steps:
  - name: Pull model
    run: curl -sS -X POST -d '{"model":"llama3.2","stream":false}' ollama:11434/api/pull
  - name: Generate response
    run: curl -sS -X POST -d '{"model":"llama3.2","stream":false,"prompt":"Hello world"}' ollama:11434/api/generate
```

GitLab:

```yaml
image: alpine/curl
run:
  - name: Pull model
    script: curl -sS -X POST -d '{"model":"llama3.2","stream":false}' ollama:11434/api/pull
  - name: Generate response
    script: curl -sS -X POST -d '{"model":"llama3.2","stream":false,"prompt":"Hello world"}' ollama:11434/api/generate
```
That's all we need--let's see the response:
```shell
> curl -sS -X POST -d '{"model":"llama3.2","stream":false}' ollama:11434/api/pull
{"status":"success"}

> curl -sS -X POST -d '{"model":"llama3.2","stream":false,"prompt":"Hello world"}' ollama:11434/api/generate
{"model":"llama3.2","created_at":"2025-02-06T18:46:52.362892453Z","response":"Hello! It's nice to meet you. Is there something I can help you with or would you like to chat?","done":true,"done_reason":"stop","context":[128004,9125,128007,276,39766,3303,33025,2696,22,8790,220,2366,11,271,128009,128006,882,128007,271,9906,1917,128009,128006,78191,128007,271,9906,0,1102,596,6555,311,3449,499,13,2209,1070,2555,358,649,1520,499,449,477,1053,499,1093,311,6369,30],"total_duration":9728821911,"load_duration":2319403269,"prompt_eval_count":27,"prompt_eval_duration":3406000000,"eval_count":25,"eval_duration":4001000000}
```
The full JSON response is verbose; piping the output through jq lets us extract just the fields we need:

```shell
> curl -sS -X POST -d '{"model":"llama3.2","stream":false}' ollama:11434/api/pull | jq -r .status
success

> curl -sS -X POST -d '{"model":"llama3.2","stream":false,"prompt":"Hello world"}' ollama:11434/api/generate | jq -r .response
Hello! It's nice to meet you. Is there something I can help you with or would you like to chat?
```
With just a few lines of code, we're able to run an Ollama server, pull down a large language model, and generate responses--all completely local to our CI job. We can now use this capability to generate release notes, automate code review, write documentation--the possibilities are endless.
Have you ever needed to pass a number of parameter values from one function into another function with some (but not all) of the same parameters? I've run into this particular pain point multiple times when writing functions in the past so I decided to find somewhat of a workaround.
If you've spent some time writing PowerShell functions, you may be familiar with the $PSBoundParameters automatic variable. As Microsoft defines it, this variable
Quote
Contains a dictionary of the parameters that are passed to a script or function and their current values.
Let's take a look at the below example. We'll write a simple function to return the contents of the $PSBoundParameters variable:
```powershell
PS > function Get-BoundParameters {
    param (
        $Parameter1,
        $Parameter2,
        $Parameter3
    )
    $PSBoundParameters
}
PS > Get-BoundParameters -Parameter1 'this is param1' -Parameter3 'this is param3'

Key        Value
---        -----
Parameter1 this is param1
Parameter3 this is param3
```
As you can see, the $PSBoundParameters variable contains the parameters Parameter1 and Parameter3 and their contents, but does not contain Parameter2 since I never used that parameter.
$PSBoundParameters comes in handy quite often for things like checking whether a certain parameter has been used:
```powershell
PS > function Get-Foods {
    param (
        [Parameter(Mandatory)]
        $Fruits,
        $Vegetables
    )
    $message = "Here are the fruits"
    if ($PSBoundParameters.ContainsKey('Vegetables')) {
        $message += " and vegetables"
    }
    Write-Output $message
    $Fruits + $Vegetables | ForEach-Object { "- $_" }
}
PS > Get-Foods -Fruits apple,orange -Vegetables carrot,celery
Here are the fruits and vegetables
- apple
- orange
- carrot
- celery
```
In this example, we use $PSBoundParameters to check if it contains the key 'Vegetables' (meaning the $Vegetables parameter was used), and if it does we add the phrase " and vegetables" to the end of the return message.
Now that we understand how $PSBoundParameters works, let's examine where it falls short.
While $PSBoundParameters is great, the issue is that it only contains bound parameters (as the name suggests). This means that if a parameter has a default value, that default will never be included in the $PSBoundParameters dictionary.
I've run into this situation many times when writing functions that pass certain parameter key/values to other commands, such as this example:
Splatting
In the Get-UserItemsParent function, we use a method called splatting when calling the Get-UserItems function. Splatting is a way to pass all parameters and values to a command as a dictionary instead of writing them out the long way. As an example, this:
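Here's a sketch using the Get-UserItems function from the upcoming example:

```powershell
# The long way: every parameter written out explicitly
Get-UserItems -Name 'Bob' -Items 'apple','orange'

# The same call using splatting: collect the parameter names and values
# in a hashtable, then pass the whole dictionary at once
$parameters = @{
    Name  = 'Bob'
    Items = 'apple','orange'
}
Get-UserItems @parameters
```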
Note that when splatting, the dictionary variable (in this case, $parameters) is written with an @ sign instead of a $.
If you're unfamiliar with splatting, I highly recommend reading the documentation to learn how you can take advantage of it in your scripts. I'll likely cover it and more ways to use it in a future post.
```powershell
PS > function Get-UserItems {
    param (
        $Name,
        $Items
    )
    Write-Output "$Name has the following items:"
    foreach ($Item in $Items) {
        "- $Item"
    }
}
PS > function Get-UserItemsParent {
    param (
        $Name = "Bob",
        $Items
    )
    Get-UserItems @PSBoundParameters
}
PS > Get-UserItemsParent -Items apple,orange,carrot,celery
 has the following items:
- apple
- orange
- carrot
- celery
```
In the above example, the fruits and vegetables in the $Items variable were passed from the Get-UserItemsParent function to the Get-UserItems function via the $PSBoundParameters variable, but the name Bob was not. "Bob" is the default value of the $Name parameter; since that parameter was never actually used by the user, it isn't part of $PSBoundParameters. This is the dilemma we're here to solve.
What we're really looking for is akin to a $PSAssignedParameters variable, which in theory would contain assigned parameters (i.e. any parameters with values, whether from the user or from default values). Unfortunately, this isn't a real variable (at least not yet), but the below code snippet is a suitable workaround:
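Assembled from the pieces explained below, the full snippet looks like this:

```powershell
# Enumerate every parameter defined on the current command, and keep
# any that currently hold a value (whether bound or default)
$PSAssignedParameters = @{}
[System.Management.Automation.CommandMetadata]::new($MyInvocation.MyCommand).Parameters.GetEnumerator() | ForEach-Object {
    $var = Get-Variable -Name $_.key -ValueOnly
    if ($var) {
        $PSAssignedParameters[$_.key] = $var
    }
}
```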
The $MyInvocation automatic variable and its MyCommand property represent the command that's currently running. Creating a new [CommandMetaData] object from the current command allows us to find all parameters available for the command by accessing the Parameters property. These parameters, and information about them, are stored as a dictionary. Finally, GetEnumerator() allows us to iterate ("loop") through each of the parameters in the dictionary.
```powershell
$var = Get-Variable -Name $_.key -ValueOnly
```
While looping through the parameters, we use the Get-Variable cmdlet to get the value of each parameter.
```powershell
if ($var) {
    $PSAssignedParameters[$_.key] = $var
}
```
If the parameter value ($var) is not empty, add it to the $PSAssignedParameters hashtable.
And that's it! All we need are 7 lines to get all parameters with assigned values. In the next section I'll discuss how we can build this into a function for reusability.
The function can be used as-is, but let's see if we can improve it. It occurred to me that, while getting the assigned parameters is the goal, there are times when we may want to include or exclude specific parameters. Instead of repeatedly writing code in our scripts to remove unwanted keys from $PSAssignedParameters, let's add that functionality to the function.
Here I added the $Include and $Exclude parameters along with the corresponding logic:
- If $Include is used, only return parameters in the $Include array
- If $Exclude is used, only return parameters not in the $Exclude array
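A sketch of that version (the same logic as the final function, minus the parameter sets and comment-based help added later):

```powershell
function Get-AssignedParameter {
    param (
        [System.Management.Automation.InvocationInfo]$Invocation,
        [string[]]$Include,
        [string[]]$Exclude
    )
    $PSAssignedParameters = @{}
    [System.Management.Automation.CommandMetadata]::new($Invocation.MyCommand).Parameters.GetEnumerator() | ForEach-Object {
        if ($Include) {
            # Only return parameters named in the $Include array
            if ($_.key -in $Include) {
                $var = Get-Variable -Name $_.key -ValueOnly
                if ($var) { $PSAssignedParameters[$_.key] = $var }
            }
        } elseif ($_.key -notin $Exclude) {
            # Only return parameters not named in the $Exclude array
            $var = Get-Variable -Name $_.key -ValueOnly
            if ($var) { $PSAssignedParameters[$_.key] = $var }
        }
    }
    $PSAssignedParameters
}
```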
While we'd hope nobody would try to use the $Include and $Exclude parameters at the same time, we'll want to follow PowerShell best practices and ensure our function can't be used in unintended ways. To accomplish this, we'll use Parameter Sets:
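The param block with the two parameters split into separate sets:

```powershell
param (
    [System.Management.Automation.InvocationInfo]$Invocation,
    [Parameter(ParameterSetName = 'Include')]
    [string[]]$Include,
    [Parameter(ParameterSetName = 'Exclude')]
    [string[]]$Exclude
)
```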
By defining the $Include and $Exclude parameters as two different parameter sets, we allow PowerShell to do the work for us. As we can see when we run Get-Help against our function:
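The syntax section shows one entry per parameter set (output abridged):

```
SYNTAX
    Get-AssignedParameter [-Invocation <InvocationInfo>] [-Exclude <String[]>] [<CommonParameters>]

    Get-AssignedParameter [-Invocation <InvocationInfo>] [-Include <String[]>] [<CommonParameters>]
```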
The two parameters are part of separate parameter sets and thus it's impossible to use both parameters at the same time.
Finally, we'll add our CmdletBinding and some comments:
```powershell
function Get-AssignedParameter {
    <#
    .SYNOPSIS
        Gets all parameters with assigned values.
    .DESCRIPTION
        This function returns any parameters from a provided invocation with assigned values--whether that be bound parameter values provided by the user, or default values.
    .PARAMETER Invocation
        The invocation from which to find the parameters. Typically this will be the automatic variable `$MyInvocation` within a function or script.
    .PARAMETER Include
        A string array of parameter names to include in the returned result. If this parameter is used, only parameters in this list will be returned.
    .PARAMETER Exclude
        A string array of parameter names to exclude from the returned result. If this parameter is used, any parameters in this list will not be returned.
    .OUTPUTS
        System.Collections.Hashtable
    .LINK
        https://DevOpsJeremy.github.io/documentation/powershell/Get-AssignedParameter.html
    .LINK
        Getting Assigned Parameters in PowerShell: https://devopsjeremy.github.io/powershell/2023/10/16/getting-assigned-parameters.html
    .EXAMPLE
        Get-AssignedParameter -Invocation $MyInvocation

        Gets any assigned parameter key/values.
    .EXAMPLE
        Get-AssignedParameter -Invocation $MyInvocation -Include Name,Status

        Gets the 'Name' and 'Status' parameter key/values if they are assigned.
    .EXAMPLE
        Get-AssignedParameter -Invocation $MyInvocation -Exclude ComputerName

        Gets any parameter key/values which are assigned, excluding the 'ComputerName' parameter.
    #>
    [CmdletBinding(DefaultParameterSetName = 'Exclude')]
    param (
        [System.Management.Automation.InvocationInfo]$Invocation,
        [Parameter(ParameterSetName = 'Include')]
        [string[]]$Include,
        [Parameter(ParameterSetName = 'Exclude')]
        [string[]]$Exclude
    )
    $PSAssignedParameters = @{}
    [System.Management.Automation.CommandMetaData]::new($Invocation.MyCommand).Parameters.GetEnumerator() | ForEach-Object {
        if ($Include) {
            if ($_.key -in $Include) {
                $var = Get-Variable -Name $_.key -ValueOnly
                if ($var) {
                    $PSAssignedParameters[$_.key] = $var
                }
            }
        } elseif ($_.key -notin $Exclude) {
            $var = Get-Variable -Name $_.key -ValueOnly
            if ($var) {
                $PSAssignedParameters[$_.key] = $var
            }
        }
    }
    $PSAssignedParameters.Clone()
}
```
And now our function is complete! Let's test it out using our function from earlier:
```powershell
PS > function Get-UserItems {
    param (
        $Name,
        $Items
    )
    Write-Output "$Name has the following items:"
    foreach ($Item in $Items) {
        "- $Item"
    }
}
PS > function Get-UserItemsParent {
    param (
        $Name = "Bob",
        $Items
    )
    $PSAssignedParameters = Get-AssignedParameter -Invocation $MyInvocation
    Get-UserItems @PSAssignedParameters
}
PS > Get-UserItemsParent -Items apple,orange,carrot,celery
Bob has the following items:
- apple
- orange
- carrot
- celery
```
As we can see, not only did the $Items parameter get passed to the child function, but so did the default value of $Name.
I hope you found this article helpful--be sure to follow the socials below to keep up with future posts.