Posts for year 2014, total posts 59 / 59 migrated
- $remotefile - the URL to the remote file we want to download
- -O localpath - the place we want to save the file
- --no-check-certificate - this one was needed as WordPress didn't have a signed cert, or something to that effect
- --secure-protocol - let wGet handle figuring out which protocol to use
- --keep-session-cookies - yep, we need to preserve the data for our session
- -U - user agent; we're spoofing a Mozilla user agent, as WordPress didn't like me using a PowerShell scripting agent
- --load-cookies - the path to the cookies.txt file we exported earlier
- "Export Cookies" Firefox add-on - https://addons.mozilla.org/en-US/firefox/addon/export-cookies/
- How to use WGET to download from a site with a login - http://askubuntu.com/questions/161778/how-do-i-use-wget-curl-to-download-from-a-site-i-am-logged-into
- Get wGet - http://gnuwin32.sourceforge.net/packages/wget.htm
- 7Zip
- NotePad++
- Google Chrome
- Java
- Flash
- VLC Player
- Microsoft C++
- PuTTY
- Fiddler
- DotNet Framework
- Virtual Box
- Virtual Clone Drive
- FoxIT
- DSC Push: A system is built, and a devop/admin/sysop pushes a config to the machine by logging on locally and pushing the config down to the system, OR a system is built and then an outside construct pushes the DSC config down to the system remotely; this could be a runbook or some other flavor of Service Management Automation (SMA).
- DSC Pull: A system is built and then instructed by some mechanism to reach out to an SMB share or IIS server which is configured as a DSC Pull Server, and the system downloads a configuration from there.
- The most up-to-date WMF pack
- The DSC resources here. Copy the DSC resources to C:\Program Files\WindowsPowerShell\Modules.
- Finally, run the script you'll find on Jason Helmick's post.
- Ensure a DSC configuration (a .mof file) exists for our newly built machine
- Generate a GUID for our newly built machine, rename our .mof from $nodename to Guid.mof
- Assign a checksum to our config and ensure the extension is .mof.checksum (using the New-DSCChecksum cmdlet)
- Tell our newly built machine to pull down a configuration from a pull server using the Guid and relevant Pull Server settings
- With SCCM's little brother, Microsoft Deployment Toolkit, and its Lite Touch Install
- Use Windows System Image Manager to configure an unattended.xml file with a first-boot action to look for a pull server (set up a DNS alias for a pull server)
- Use Group Policy to run a logon script that runs this PowerShell script
- If you make use of another system building tool, use your Orchestration framework to Invoke this script and impart the DSC pull setting back down.
All Posts for 2014
My Site - 2014 in review

The WordPress.com stats helper monkeys prepared a 2014 annual report for this blog.
Here's an excerpt:
The Louvre Museum has 8.5 million visitors per year. This blog was viewed about 76,000 times in 2014. If it were an exhibit at the Louvre Museum, it would take about 3 days for that many people to see it.
Click here to see the complete report.
Unholy Matrimony - wGet and PowerShell together

XML has been around for a LONG time. But just like working with HTML, it still kind of stinks. If you want to reach into a file and pull out values within certain tags, you'd better become a pro with XPath or be prepared to create some REALLY ugly regex. For instance, if we wanted to grab the values within the tag we care about for this blog post, the regex would be this simple little number.
<wp:attachment_url\b[^>]*>(.*?)</wp:attachment_url>
You know what they say about using regex…
(http://xkcd.com/1171/)
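For the morbidly curious, the regex route can be exercised straight from PowerShell. A quick sketch (the export path is a placeholder, and WordPress wraps these values in CDATA, so expect to trim that from the captures):

```powershell
# Sketch only: path is a placeholder; captured values may arrive wrapped in CDATA
$raw = Get-Content -Raw "C:\Users\Stephen\Downloads\export.xml"
$pattern = '<wp:attachment_url\b[^>]*>(.*?)</wp:attachment_url>'
[regex]::Matches($raw, $pattern) | ForEach-Object { $_.Groups[1].Value }
```

But as the comic suggests, the [xml] approach below is far less fragile.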
Fortunately for us, PowerShell makes it very easy to work with XML and pull out properties, and makes it super easy if you take advantage of the built-in .NET System.Xml.XmlDocument object type to help you parse its values.
For this example, I wanted to make a full backup of my family WordPress photo site, but there was no easy way to download all of the photos from my blog posts, which is a necessity to transfer my content to another provider, or at least to maintain my own copy of all of my work. (Note to WordPress.com engineers: it would be totally sweet to have the option to download my entire blog in one nice little .zip!) So if you're curious about how to back up your WordPress blog, including all media, read further!
How to Export your files from WordPress
So, there's no option to download all of the media, but there is the option to download a Blogtitle.XML file, which you can then import into other blog providers. Go to your Blog Dashboard, then Settings->Export.
At the time I took this screen shot, I had a different working title for this post
Scraping XML
This XML file describes the overall layout of our site, including posts, forum replies, and things like that. It also provides the full URL of all of the media actually featured within any of your posts. If we're good with XML parsing, we can scrape this document and pull out the URLs to all of our media featured anywhere on our site. Here's an example of what the XML looks like, showing the layout of a normal image post on a photo-centric blog.
So, now that we've got an XML file, let's get into which commands we have available to work with XML. If you were to do a search for commands in PowerShell that support or have the word XML in their name, you'll only find these few:
Let's stay away from Import-CliXml; down that path lies only sorrow. This cmdlet is completely the wrong command to use for parsing someone else's XML files. See, these CliXml commands really expect you to take PowerShell objects and export them using Export-CliXml, and then to later reimport them using Import-CliXml. It would work for a lot of things, but it's not the right command to use to import someone else's XML.
Instead, to import those files, weâll simply take the content of the XML file, store it in a variable (in this case $xml) and be sure to cast that as [XML] in the process.
#Cast as [xml] while creating the variable
[xml]$xml = Get-Content "C:\Users\Stephen\Downloads\theowenden.wordpress.2014-12-17.xml"
And once that's done, a quick Get-Member will show us that PowerShell is now parsing the content of this file successfully.
So many methods! Thank you PowerShell for making this easy!
We're looking for a method that lets us find certain elements based on the name of their tag (the "wp:attachment_url" tag). Looking through the list, one jumps out at me. The method we'll use is .GetElementsByTagName(). Our next step is to take a look at the XML file itself and see which tag contains the data we're interested in.
Here's a screen shot of my .xml file; I've highlighted the tag name we're interested in.
Yep, same screen shot as before. A rarely seen example of code reuse in the wild!
Now, let's provide the name of the tag we want as the argument to our .Method(). If we get back a lot of URLs for images, we're good to go.
It should only take a short PowerShell hop to prepare a skeleton to download each file; we're almost done!
#Cast as [xml] while creating the variable
[xml]$xml = gc "C:\Users\Stephen\Downloads\theowenden.wordpress.2014-12-17.xml"
$count = $xml.GetElementsByTagName("wp:attachment_url").Count
$xml.GetElementsByTagName("wp:attachment_url") | %{
    "Downloading $i of $count"
    $remoteFile = $_."#text"
    $destination = "C:\Users\Stephen\Dropbox\Photos\Catherine\$($remoteFile.Split('/')[5])"
    $destination
    #ToDo:
    #Figure out how to download the file...should be easy, right?
    $i++
}
The wrinkle
I totally couldn't figure out how to do this next part in PowerShell.
I tried every variation under the sun to do this natively in PowerShell, and was never able to succeed past the WordPress logon page to download these images. I could log on (using the Facebook example), but maintaining my session and using that to execute a download as an authenticated user? No workey. I tried the Facebook method from the Invoke-WebRequest help, I tried making a local $Form.Fields object and populating the values, I tried the System.Net.WebClient class. Like Thomas Edison, I merely found 24 ways that wouldn't work.
I even went so far as to install Fiddler and proxy my Invoke-WebRequest calls through it to see why in the world I couldn't maintain my session after passing authentication. I gave the form my username and password; what more could it want from me?
Unholy matrimony, wGet and PowerShell together
I took a different approach. I already had the framework in place to pull the image URLs out of the XML file. I had the skeleton there to act on each and save them locally. The last missing piece was a mechanism to handle downloading the files and I knew that the mystical wGet utility from Linux can handle pretty much anything under the sun.
I did it: I merged PowerShell code to automate commands going to wGet. I'm not proud of it, but it worked!
From previous experience, I knew that you could provide a cookies.txt file to wGet and it could "just work" and pick the right cookie for the right site. I used the awesome Firefox plug-in "Export Cookies" to dump all of my cookies into a C:\temp\cookies.txt file. If you open up the cookies.txt, you can see a LOT of session data. You can cull it down to just the site you need to perform your task, but I was lazy and provided my full 2KB cookies.txt file to wGet.
Basically, enable the plug-in, go to WordPress and log in. Then hit Alt on the keyboard to display the toolbar, and choose the "Export Cookies" option.
Awww one of the many photos I wanted to download!
You'll get a cookies.txt file. Now, pass this in your wGet --load-cookies parameter, and watch the magic. To execute this command, you'll need to provide the following params to wget.
wget.exe $remoteFile -O $localPath --no-check-certificate --secure-protocol='auto' --keep-session-cookies -U "Mozilla" --load-cookies c:\temp\cookies.txt
Start with downloading just a single file!
$xml.GetElementsByTagName("wp:attachment_url") | Select -First 1 | %{
    $remoteFile = $_."#text"
    $destination = "C:\Users\Stephen\Dropbox\Photos\Catherine\$($remoteFile.Split('/')[5])"
    wget.exe --load-cookies c:\temp\cookies.txt $remoteFile -O $destination --no-check-certificate --secure-protocol='auto' --keep-session-cookies -U "Mozilla"
}
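One note on the $remoteFile.Split('/')[5] trick: the hard-coded index assumes the exact depth of a wordpress.com media URL (host, year, month, file). A slightly safer sketch is to just take the last segment, whatever the depth; the sample URL below is only an illustration:

```powershell
# Take the final URL segment as the file name, regardless of path depth
# (the URL here is a made-up example of the wordpress.com media shape)
$remoteFile = 'https://1redfamily.files.wordpress.com/2014/05/img_5194.jpg'
$fileName = $remoteFile.Split('/')[-1]
$destination = "C:\Users\Stephen\Dropbox\Photos\Catherine\$fileName"
```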
The output:
Resolving 1redfamily.files.wordpress.com... 192.0.72.3
Connecting to 1redfamily.files.wordpress.com|192.0.72.3|:443... connected.
WARNING: cannot verify 1redfamily.files.wordpress.com's certificate, issued by `/C=US/ST=Arizona/L=Scottsdale/O=GoDaddy.com, Inc./OU=http://certs.godaddy.com/repository//CN=Go Daddy Secure Certificate Authority - G2': Unable to locally verify the issuer's authority.
HTTP request sent, awaiting response... 200 OK
Length: 6766012 (6.5M) [image/jpeg]
Saving to: `C:/Users/Stephen/Dropbox/Photos/Catherine/img_5194.jpg'
100% COMPLETE
Yay! If one worked, then the same should be true for all of the rest! Here's the code!
Just a warning…the output from this will be VERY ugly if you run it from within the ISE. The ISE does not appreciate wGet's method of displaying process output to the console window. If any of you can figure out how to preserve a session for a PHP-based logon server, and do it with only PowerShell, please let me know!
#Cast as [xml] while creating the variable
[xml]$xml = gc "C:\Users\Stephen\Downloads\theowenden.wordpress.2014-12-17.xml"
$count = $xml.GetElementsByTagName("wp:attachment_url").Count
$i = 1
$xml.GetElementsByTagName("wp:attachment_url") | %{
    "Downloading $i of $count"
    $remoteFile = $_."#text"
    $destination = "C:\Users\Stephen\Dropbox\Photos\Catherine\$($remoteFile.Split('/')[5])"
    $destination
    #ToDo:
    #Figure out how to download the file...should be easy, right? - COMPLETED!
    wget.exe --load-cookies c:\temp\cookies.txt $remoteFile -O $destination --no-check-certificate --secure-protocol='auto' --keep-session-cookies -U "Mozilla"
    $i++
}
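For anyone who wants to retry the pure-PowerShell route I gave up on, the usual session-preserving pattern looks like the sketch below. The form field names and login URL are assumptions about the wordpress.com logon page, not something I verified, so treat it as a starting point only:

```powershell
# Untested sketch: the login URL and field names (log, pwd) are assumptions
$r = Invoke-WebRequest 'https://wordpress.com/wp-login.php' -SessionVariable session
$form = $r.Forms[0]
$form.Fields['log'] = 'MyUserName'
$form.Fields['pwd'] = 'MyPassword'
# Post the form back, reusing the same session so its cookies persist
$r = Invoke-WebRequest 'https://wordpress.com/wp-login.php' -WebSession $session -Method Post -Body $form.Fields
# Subsequent downloads reuse the (hopefully) authenticated session's cookies
Invoke-WebRequest $remoteFile -WebSession $session -OutFile $destination
```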
Sources
I couldn't have completed this job without the help of the forum post from the Ubuntu discussion board.
PowerShell Version 5, What's new!

PowerShell native switch configuration
I'm not going to dig into this too deeply; instead, read Jeffrey Snover's great post on the topic here.
APT-GET comes to PowerShell!
The coolest new feature is OneGet, which is PowerShell's adaptation of the community-based software repository Chocolatey. Chocolatey supports a tremendous catalog of software, allowing you to silently install software directly from the command line. Some examples of software found in the Chocolatey gallery:
You can see a full catalog of software here, http://chocolatey.org/packages.
Sample of using OneGet to install packages
First and foremost, you'll need to temporarily allow remote script execution in order to use this version of OneGet. That is because, behind the scenes, to install a program using OneGet, PowerShell will download a Chocolatey install script and execute it, and if your execution policy prohibits it from running, you won't be having any fun. To get started, first install WMF 5.0, available here. This may or may not require a restart for you. Now, launch PowerShell and check out the version of PowerShell you're running with Get-Host.
Aw, yeah…Upgrayedd
Now, let's import the OneGet module and see what new commands are available. PowerShell exposes some very nice functionality here. Out of the box, we're able to add our own corporate PackageSource repository, and do some other interesting things:
| Command | Purpose |
| --- | --- |
| Add-PackageSource | Add your own package source other than Chocolatey |
| Find-Package | Search your package sources for software |
| Get-Package | Get information about packages installed locally |
| Get-PackageSource | Get a listing of package sources available |
| Install-Package | Install a package |
| Remove-PackageSource | Remove a package source |
| Uninstall-Package | Uninstall a package from your system |
Let's say that we needed a tool to work with PDFs, and had never heard of Adobe before. We might run Find-Package and pipe that into Where-Object to filter.
You could potentially discover software to install from the command line.
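That filter step might look something like this sketch (the Summary property name comes from the preview build of OneGet and may shift between releases):

```powershell
# Search the default (Chocolatey) source for anything mentioning PDF
Import-Module -Name OneGet
Find-Package | Where-Object { $_.Summary -match 'PDF' } | Select-Object Name, Version
```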
Let's choose Foxit Reader. Remember when I said to allow script execution? Well, this is why. If you try to install without specifying this, you'll get the following error.
The install script can't run if you don't allow unsigned scripts during your Install-Package session.
This is what is really happening when you use OneGet to install FoxitReader. PowerShell first downloads the configuration script (C:\Chocolatey\lib\FoxitReader\tools\ChocolateyInstall.ps1) that looks like this:
Install-ChocolateyPackage 'FoxitReader6010.1213_L10N_Setup.exe' 'EXE' '/quiet' 'http://cdn01.foxitsoftware.com/pub/foxit/reader/desktop/win/6.x/6.0/L10N/FoxitReader6010.1213_L10N_Setup.exe'
Which, as you can see, downloads the .EXE from the provider's CDN, then passes quiet install parameters on to the .exe.
EDIT 12/16/2014: As of this writing, the problem with Chocolatey packages not installing unless you run with the signing policy as "Unrestricted" has been resolved. Leaving the below for posterity.
[…] So, hopefully you've launched an administrative session of PowerShell and set your execution policy to Unrestricted for the moment. Assuming you've done so, you should see the following when you run your install for 7Zip or FoxitReader. If you're not running as an administrative user, you'll get a UAC prompt, which I personally feel is good behavior, then the install will continue. Since these scripts are configured by the application owners, some will be silent installs, some will not.
For instance, if you run the install of Visual C++ 2010 from an administrative PowerShell prompt, the application will install with no prompt whatsoever. All in all, very powerful stuff, and it finally brings apt-get-like functionality to PowerShell.
10/10 would download again.
EDIT: I've noticed that Install-Package has parameters to pass your switches along to the .exe files, and you can see there are a lot of parameters available. However, it's early in the game, and as of this writing the help files don't exist for this and other PowerShell v5 resources.
Solving the DSC Pull Chicken and Egg problem

This post is part of the Learning DSC Series, click the link to explore the rest of the posts!
My 100th Post, Thank you!
Hi guys, it's here: my 100th post anniversary special! I want to thank all of my loyal readers, commenters, and the folks who've liked my blog over the last 18 months for their input, critique and exciting ideas.
I'd also like to thank my extremely talented friend Joie Brown for designing this wonderful and festive banner for my site to celebrate the occasion! She is a wonderfully skilled artist, illustrator and designer, and you can find out more about her freelance art at www.joieart.net. She's done work for My Little Pony, popular web comics and more, including her own printed comic book! This banner turned out great and I owe her a lot of gratitude for it.
Honestly, the feedback I've gotten from Reddit, Twitter and on my site itself is inspiring, and drives me to make better and better content. Thanks for sticking with me, and please, as always, feel free to e-mail me your questions, topic suggestions, or any critique! Stephen.Owen.ii@Gmail.com
DSC's Chicken and Egg Problem
Part of my series on replacing and reproducing Group Policy using PowerShell Desired State Configuration.
Anyone who's followed my blog or industry news knows that there is a lot of excitement in the Windows world about the growth of PowerShell and the introduction of Desired State Configuration. DSC will surely grow to replace at least Group Policy, and likely also begin chipping away at Configuration Manager (SCCM or ConfigMan, as it is popularly known) for ConfigMgr's DCM and software distribution. Just my prediction :)
As I've covered before on my blog, Desired State Configuration settings currently come to machines in one of two ways: Push or Pull. Here's the run-down:
The differences between them highlight one of the current challenges you'll run into with DSC: while you could write and push a DSC configuration out for every system created, it would really be better to instruct your VMs or physical infrastructure to automatically look for configuration settings as they're being built.
The Challenges from here
Here are the problems we need to solve:
In order for a machine to successfully register with a DSC Pull server, at the time you make the registration, a DSC configuration GUID *must already exist for that server*. Knowing that, how do we ensure a configuration exists for a brand new machine? How do we tell our systems about Pull servers while we're building them?
Active Directory Group Policy side-steps these issues entirely by delivering config settings as part of standard Group Policy when a new system joins the domain. Since most builds of Windows machines will have them joining a domain, it really is a very nice configuration package.
So the DSC Chicken and Egg problem, as I've coined it, is this:
In order to register a DSC Pull server successfully, you must pass a GUID with your Pull settings. However, if you're trying to assign a Pull server for a brand new machine, how do you ensure that a configuration with the appropriate GUID exists? And if you can create a config, how do you return it back to the local machine?
In this article, I'll outline how to configure a machine for DSC Pull while imaging, which can be used in MDT or SCCM Task Sequences. The goal will be to give an example of implanting DSC Pull server settings on our systems while they're being built.
Following Along
If you'd like to follow along, I recommend following Jason Helmick's blog post here on building your own DSC Pull server. We'll use his method and set up an SMB-based pull server.
You will need:
- Virtual or physical test infrastructure, at least two machines
- If you'd like to test the baked-in approach using MDT or SCCM, you'll need one single-site SCCM 2012 R2 server OR one MDT server
Assuming you have a newly imaged Server 2012 R2 server, you'll need to install:
Blam, you've got a DSC server.
No way, that was too easy!
It was deceptively easy, right? What happened under the covers was that an instance of IIS was spun up and configured to run a web app of the Desired State Configuration Pull Service, with a source directory for configs set up as $env:ProgramFiles\WindowsPowerShell\DscService\Configuration, and also an instance of the DSC Compliance server (which we'll get into in a later post…once I can understand it!). The script from Jason's blog post configured IIS to listen on port 8080, so we'll need to keep that in mind by directing any requests to our DSC server to that port, using the syntax http://ServerName:Port. You can change the bindings in IIS if you'd like, or change it in the script before you launch it.
Alright, and to verify that the service is working, go to http://PullServerName:8080/psDSCPullServer.svc/
This isn't very human-readable, but if you're following along from home, you should see something like this. Henceforth, when we're going to provide a pull service URL in our DSC configs, we'll provide this full path as the URL. Our DSC Pull server exists, satisfying one part of our goals; next, let's look at how you instruct a client to pull configs down from a DSC Pull server.
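If you'd rather check from the console than a browser, a quick sketch of the same verification (the server name is a placeholder):

```powershell
# 200 means the pull service endpoint is answering
$r = Invoke-WebRequest -Uri 'http://PullServerName:8080/psDSCPullServer.svc/' -UseBasicParsing
$r.StatusCode
```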
Into the Local Configuration Manager
Beginning with PowerShell v4, there is a new engine running under the covers: the Local Configuration Manager. You can interact with it by using PowerShell commands like Get-DscLocalConfigurationManager. This is where the comparative magic of DSC takes place, where your system evaluates what it should look like, and also where it takes action or reports if it is out of compliance. This is an incredibly powerful engine, and I fully believe that over the next decade we will spend countless hours coming up with ways to leverage it to our professional and personal success.
Here's a screenshot of the default state of a DSC endpoint.
AllowModuleOverWrite           : False
CertificateID                  :
ConfigurationDownloadManagers  : {}
ConfigurationID                :
ConfigurationMode              : ApplyAndMonitor
ConfigurationModeFrequencyMins : 15
Credential                     :
DebugMode                      : False
DownloadManagerCustomData      :
DownloadManagerName            :
LCMCompatibleVersions          : {1.0, 2.0}
LCMState                       : Ready
LCMVersion                     : 2.0
MaxPendingConfigRetryCount     :
StatusRetentionTimeInDays      : 7
PartialConfigurations          : {}
RebootNodeIfNeeded             : False
RefreshFrequencyMins           : 30
RefreshMode                    : PUSH
ReportManagers                 : {}
ResourceModuleManagers         : {}
PSComputerName                 :
We will need to modify a few of these values to reflect the settings for our Pull server. We'll do that using a DSC configuration resource titled LocalConfigurationManager, which we'll set, also using DSC. You can set your system to pull down a config from a pull server using the following syntax:
Configuration SetPullMode
{
    param([string]$guid, $machineName)
    Node $machineName
    {
        LocalConfigurationManager
        {
            ConfigurationMode = 'ApplyOnly'
            ConfigurationID = $guid
            RefreshMode = 'Pull'
            DownloadManagerName = 'WebDownloadManager'
            DownloadManagerCustomData = @{
                ServerUrl = 'http://serverName:8080/PSDSCPullServer.svc';
                AllowUnsecureConnection = 'true'
            }
        }
    }
}
SetPullMode -guid $Guid
Set-DSCLocalConfigurationManager -Computer servername -Path ./SetPullMode -Verbose
Make sure you include "AllowUnsecureConnection"; otherwise DSC will attempt to query for a settings page on port 443 (HTTPS instead of HTTP), and you'll have a nasty hour or two's worth of errors to solve.
Thanks to Pete Zerger and Steven Murawski for their excellent blog posts on DSC Pull, which helped me understand the settings needed here. Thanks also to Jacob Benson's post on DSC troubleshooting, which helped me realize that you do need "AllowUnsecureConnection".
So far we've seen how to build a DSC Pull server (EASY!) and also how to instruct a single endpoint to look to the DSC Pull server for a configuration. That's great and all, but we still haven't created a DSC configuration for this machine we're building, nor have we dealt with some of the problems above, like making sure that a DSC configuration exists, giving it a GUID, signing it with a checksum, and then registering this GUID in the DSC client at build time.
The Flow: Create a DSC Resource on-demand, before you need it
The overall next steps here are as follows, to be conducted while our machine is building:
But how would I automate this in production? Great question! We're solving this hairy problem using PowerShell sessions and remoting. During the imaging process, we'll include a step to run a PowerShell script which will step out to another server, create a unique GUID, and generate a new DSC configuration for our machine, renaming the config to the GUID we created. Finally, we'll use Invoke-Command to pull back the GUID and use that to configure DSC Pull locally on the new machine.
Assuming we're using SCCM, what will happen here is that we'll add one step to our Task Sequence: "Run PowerShell Script".
And here is the code for the script.
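The embedded script itself didn't survive the export, but based on the steps described above, a rough sketch would look like the following. Every server name, path, and helper script in it is an assumption for illustration, not the author's original:

```powershell
# Sketch only: the pull server name, config dir, and helper script are placeholders
$pullServer = 'DSCPULL01'
$node = $env:COMPUTERNAME
$guid = [guid]::NewGuid().Guid

Invoke-Command -ComputerName $pullServer -ScriptBlock {
    $guid = $using:guid
    $node = $using:node
    $configDir = "$env:ProgramFiles\WindowsPowerShell\DscService\Configuration"
    # 1. Compile a .mof for the new node (assumed helper script on the pull server)
    . 'C:\DSC\New-NodeConfiguration.ps1' -NodeName $node
    # 2. Rename the .mof from the node name to the GUID the client will pull by
    Rename-Item "$configDir\$node.mof" "$guid.mof"
    # 3. Generate the matching .mof.checksum file
    New-DSCCheckSum -ConfigurationPath "$configDir\$guid.mof" -OutPath $configDir
}

# 4. Point the local LCM at the pull server with that GUID, using the
#    SetPullMode configuration shown earlier in this post
SetPullMode -guid $guid -machineName $node
```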
Now, what will happen here is that your system will build and then run the remote PowerShell commands to create the .mof for it. Then, it will use DSC to configure itself as a Pull client and proceed with the Task Sequence. When the TS finishes, within about 30 minutes the Local Configuration Manager will attempt its first pull, grab the .mof file, and enact the configuration. What we're doing in our demo is simply copying down the source files for an .MSI, and then ensuring the MSI is installed once it arrives. Assuming you've got an MSI for 7Zip sitting on a share and you run this script, you should see this!
Our old stand-by, installing 7Zip
Wrapping up
If you can run a PowerShell script while imaging your machine, you can use this approach to bootstrap machines with DSC Pull settings. I recognize that not everybody will want to use SCCM for this procedure, so here are some other ways you could use the same script to attack this problem:
This was a challenging post for me. Frankly, DSC is still a very new technology and not that much has been written about it. What I've produced here is an answer to a question that has bothered me since I first became aware of DSC with Jeffrey Snover's WMF 4.0 talk at TechEd last year. I may have made some mistakes, and it is possible that I'm over-working the whole problem. Regardless, this solution has worked in my testing, and I would feel confident deploying this to clients or in my own environment. If you spot an error, please let me know! I'd love to make this the perfect solution to the "configuration must be there before you can embed it" problem.
Thanks!
Sources
Steve Murawski's great blog series on the topic on PowerShell.org - http://powershell.org/wp/2013/11/06/configuring-a-desired-state-configuration-client/
His site on DSC and Chef configuration methodologies - http://stevenmurawski.com/
Jason Helmick's series on ConcentratedTech - http://searchservervirtualization.techtarget.com/tip/How-to-push-out-a-Desired-State-Configura-Pull-Server
Download link to the configuration script needed to configure a DSC Pull server - http://cdn.ttgtmedia.com/rms/editorial/1.CreatePullServer.ps1.txt
Pete Zerger's awesome post on start-to-finish deployment of a DSC Pull server - http://www.systemcentercentral.com/day-1-intro-to-powershell-dsc-and-configuring-your-first-pserver/
Jacob Benson's great post on DSC troubleshooting - http://jacobbenson.com/?p=296
Mike F Robbins' great blog post about dealing with an error that choked me up (I had a server with WMF 5.0 that was pushing the image to a PowerShell v4 PC) - http://mikefrobbins.com/2014/10/30/powershell-desired-state-configuraterror-undefined-property-configurationname/#comment-18749
Download link to the newest WMF for PowerShell / DSC - http://www.microsoft.com/en-us/download/confirmation.aspx?id=44987
Download link to the newest DSC resources - https://gallery.technet.microsoft.com/scriptcenter/DSC-Resource-Kit-All-c449312d
Working with Web Services, SOAP, PHP and all the REST with PowerShell

In order to truly ascend to the next level, every scripter eventually needs to integrate an outside service into the organization, be it Air-Watch, ServiceNow, LogicMonitor, Azure AD or any other popular service.
Some of these services include ready made PowerShell modules which you can easily integrate into the environment, but others instead present something different, an API, or Application Programming Interface.
These sound scary and "developery", but they really aren't so bad. And the great thing is that they all adhere to the same standard, er, or set of standards.
Fortunately, most of the services weâll find will adhere to one of these common standards: SOAP, REST, or PHP.
The goal of this post is to give you an example of how to use each of these standards to interact with the various systems you may run across.
Not only for the web
You may have noticed from the past few posts here that I'm really getting into APIs. "What's so great about APIs?" you may ask.
APIs allow you to very easily leverage work that someone else has done to quickly create your own functions and get seriously useful output from just a little bit of work. If you're planning to orchestrate workflows in your environment, create runbooks for your data center, or make your own tools, learning how to interact with SOAP, REST and web services will definitely be in your favor.
The difference between your average "scripting guy" and an automation engineer or consultant is the ability to create your own tools, from scratch, using the APIs provided. That's where you start to make the big bucks.
How do I know which standard to use?
Sometimes we'll be super lucky, and the developers of our desired service will list which type of API they're offering. Of course, sometimes we're not so lucky and have to look elsewhere.
Fortunately, we can use the URL for the service to help determine which PowerShell cmdlets to use! The following chart shows the relationships between URL specification and cmdlet.
| URL | Service Type | Cmdlet |
| --- | --- | --- |
| Ends in .asmx or ?WSDL | SOAP | New-WebServiceProxy |
| Contains API, especially api/v2 | REST | Invoke-RestMethod |
| Ends in .php | PHP/Form | Invoke-WebRequest |
REST vs. SOAP, what's the difference?
This is a great question that came up during our users group last night. Both REST and SOAP are simply methods of accessing information presented via web services. It will suffice to say that REST is now in vogue, and is generally believed to be easier to use and manage than SOAP, which tends to be a bit heavier on XML.
The best answer I've seen came from Dr. M. Ekelstein, who put it the following way: "A nice analogy for REST vs. SOAP is mailing a letter: with SOAP, you're using an envelope; with REST, it's a postcard."
In his blog he gives an example, comparing the response you can expect from SOAP vs. the response from REST. In both examples, we're querying for the details of user "12345". Note the tremendous verbosity of one reply over the other.
| SOAP | REST |
| --- | --- |
| <?xml version="1.0"?><soap:Envelope xmlns:soap="http://www.w3.org/2001/12/soap-envelope" soap:encodingStyle="http://www.w3.org/2001/12/soap-encoding"><soap:Body pb="http://www.acme.com/phonebook"><pb:GetUserDetails><pb:UserID>12345</pb:UserID></pb:GetUserDetails></soap:Body></soap:Envelope> | http://www.acme.com/phonebook/UserDetails/12345 |
You can imagine how much work would go into parsing the real juicy bits out of the SOAP response, versus the REST response.
Simply put: if you have the option, use REST; it's much easier to deal with the objects it returns!
Working with SOAP Protocol
So, we've determined that we're working with a SOAP API, either because the service's API catalog says so, or because we used the handy URL trick and saw that the URL of this service ends in .asmx?WSDL (WSDL being short for Web Services Description Language).
The overall flow of accessing resources from a SOAP source is: connect to the source using New-WebServiceProxy, storing the result in a variable; run Get-Member on that variable to see which methods the web service offers; then go from there.
You can generally view a WSDL in your browser by, uh, browsing to it. It will be human readable XML code. For this example, weâll be using the handy Length endpoint from WebServiceX.net, which allows us to convert one unit of Length into another. When we open it in a browser, we see the following Service Description.
Fortunately for us, rather than scrolling through pages and pages of XML, PowerShell knows how to interpret this description and let us access it in a PowerShell-y way, using the New-WebServiceProxy cmdlet.
For example:
$url = "http://www.webservicex.net/length.asmx?WSDL"
$proxy = New-WebServiceProxy $url

$proxy | gm -MemberType Method
TypeName: .AutogeneratedTypes.WebServiceProxy
Name MemberType Definition
---- --------- ----------
ChangeLengthUnitCompleted Event
BeginChangeLengthUnit Method System.IAsyncResult
ChangeLengthUnit Method double ChangeLengthUnit
ChangeLengthUnitAsync Method void ChangeLengthUnitAsync
EndChangeLengthUnit Method double EndChangeLengthUnit
ToString Method string ToString()
So, this helpful output lets us see some interesting methods, all centered around changing length units. Let's take a peek at the .ChangeLengthUnit() method.
Those definition types are super long! They basically abbreviate down to ("NumberOfUnits","StartingLengthUnit","EndingLengthUnit").
We can give it a try with the following, converting 15 Meters into the equivalent number of International Confusing Headache Increments (INCHEs, for short):
$proxy.ChangeLengthUnit(15,"Meters","Inches")
590.551181102362
Pretty nifty!
Working with REST
REST APIs are the bomb, and totally fly AF. They're written when the developers of a service truly have extensibility in mind.
For this example, we'll refer back to my Get-Weather function, which I released about a month ago. When I originally wrote it, I was using Invoke-WebRequest (which effectively just loads the web page and scrapes its contents!). I've since had a come-to-Jesus meeting and fixed my code there.
Here are the most pertinent bits of that function:
$API_key = "$secret"
$url = "https://api.forecast.io/forecast/$API_key/$coords"

#Store the results in $weather
$weather = Invoke-RestMethod $url -Method Get

#Display the contents of $weather
$weather
latitude : 33.9533
longitude : -84.5406
timezone : America/New_York
offset : -5
currently : @{time=1416415006; summary=Clear; icon=clear-day; nearestStormDistance=235; nearestStormBearing=321; precipIntensity=0;
precipProbability=0; temperature=38.67; apparentTemperature=36.25; dewPoint=20.8; humidity=0.48; windSpeed=3.54; windBearing=249;
visibility=10; cloudCover=0.09; pressure=1029.21; ozone=321.84}
minutely : @{summary=Clear for the hour.; icon=clear-day; data=System.Object[]}
hourly : @{summary=Partly cloudy starting this afternoon, continuing until this evening.; icon=partly-cloudy-day; data=System.Object[]}
daily : @{summary=Light rain on Saturday through Tuesday, with temperatures rising to 67°F on Monday.; icon=rain; data=System.Object[]}
flags : @{sources=System.Object[]; isd-stations=System.Object[]; darksky-stations=System.Object[]; madis-stations=System.Object[];
lamp-stations=System.Object[]; units=us}
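Because Invoke-RestMethod hands back real objects rather than raw text, drilling into the reply is just dot-notation. Continuing with the $weather variable from the call above (property names taken from the output shown):

```powershell
# $weather holds the parsed reply from Invoke-RestMethod above
$weather.currently.temperature
# 38.67

"{0}, {1}F" -f $weather.currently.summary, $weather.currently.temperature
# Clear, 38.67F
```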
So, now that we've seen how easy it is to work with these object-oriented services, let's take a deeper peek under the covers with some PHP/form manipulation, using PowerShell's built-in Facebook example.
Working with PHP/Web Forms Objects
Now that we've seen how comparatively easy those were, let's see how we'd attack a .php/forms login.
One thing to note about using Invoke-WebRequest is that you'll be getting cozy with the HTTP verbs: GET, POST, DELETE, and others. For this example, we'll use GET and POST.
We'll run our test using the easiest .php service that I know of, the Form Post Tester on Hashemian.com. You can post any data you'd like to it in the -Body parameter of your submission, then pull the data back down later by appending a "/" and a key to the URL, which is handy for testing your HTTP GET.
Here's an example.
$FormtesterUrl = "http://www.hashemian.com/tools/form-post-tester.php"
$accessCode = "/FoxDeploy"
$URI = $FormtesterUrl + $accessCode

Invoke-WebRequest -Uri $URI -Method Post -Body "Test Message From PowerShell"
If you want to test that it worked, you can open up the full URL in a browser, and see something like this.
Now, to pull the data back down, we'll use the GET method instead.
Invoke-WebRequest -Uri http://www.hashemian.com/tools/form-post-tester.php/FoxDeploy -Method Get | Select -expand Content
In more complex scenarios, you could read the HTML of a page and provide values for all of its form fields in order to log in. If you check the Get-Help examples for Invoke-WebRequest, you'll find a very ambitious example that logs into Facebook with PowerShell!
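As a rough sketch of what that form-filling pattern looks like (the URL and field names here are placeholders, and the actual form layout will vary from site to site):

```powershell
# Sketch only: example.com and the field names are hypothetical.
# -SessionVariable captures the cookies handed out on first contact.
$page = Invoke-WebRequest 'https://example.com/login.php' -SessionVariable session
$form = $page.Forms[0]
$form.Fields['email']    = 'user@example.com'
$form.Fields['password'] = 'NotMyRealPassword'

# Post the filled-in form back, reusing the session's cookies
Invoke-WebRequest -Uri ('https://example.com' + $form.Action) `
    -WebSession $session -Method Post -Body $form.Fields
```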
Where to go from here
I hope you liked this quick tour through working with various APIs. For your next steps, you might be interested in how to work with complex authentication, covered here in Using PowerShell and oAuth.
Have a specific question? I've written modules for dozens of APIs, from AirWatch to ServiceNow and even Imgur, and can help you get your API needs sorted. Leave a message below, or on reddit.com/r/FoxDeploy, and we'll see what we can do!