Powershell | Microsoft Windows Server Blog
http://approjects.co.za/?big=en-us/windows-server/blog/tag/powershell/
Your Guide to the Latest Windows Server Product Information

Beyond RC4 for Windows authentication
http://approjects.co.za/?big=en-us/windows-server/blog/2025/12/03/beyond-rc4-for-windows-authentication/
Wed, 03 Dec 2025 17:00:00 +0000

The post Beyond RC4 for Windows authentication appeared first on Microsoft Windows Server Blog.

As organizations face an evolving threat landscape, strengthening Windows authentication is more critical than ever. The deprecation of RC4 (Rivest Cipher 4) encryption in Kerberos is a shift toward modern, resilient security standards. RC4, once a staple for compatibility, is susceptible to attacks like Kerberoasting that can be used to steal credentials and compromise networks. It is crucial to discontinue using RC4.

By mid-2026, we will be updating the domain controllers’ default assumed supported encryption types. The assumed supported encryption types are applied to service accounts that do not have an explicit configuration defined. Secure Windows authentication does not require RC4; AES-SHA1 can be used across all supported Windows versions, since it was introduced in Windows Server 2008. If existing RC4 use is not addressed before the default change is applied, authentication relying on the legacy algorithm will no longer function. This blog post helps IT professionals transition to AES-SHA1 encryption by offering steps to detect and address remaining RC4 usage.

For additional details on our Windows Update rollout strategy, check out this page on how to manage Kerberos KDC usage of RC4.

Detect RC4 usage with new tools

Aside from the Windows Update rollout of changes to domain controller default assumed supported encryption types, RC4 should be completely disabled in domain environments to maximize security. Legacy applications or interoperability with non-Windows devices may still necessitate the use of RC4, and such dependencies will need to be identified and addressed.

To support the identification of RC4 usage, we have enhanced existing information within the Security Event Log and developed new PowerShell auditing scripts. These enhancements are available in Windows Server versions 2019, 2022, and 2025.

New fields within existing Kerberos Events

The Security Event Log on Key Distribution Centers (KDCs) records when a client requests a ticket during authentication and when it requests access to a specific service within the domain:

  • 4768: A Kerberos authentication ticket (TGT) was requested
  • 4769: A Kerberos service ticket was requested
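Both event IDs can be pulled directly from the Security log on a domain controller with Get-WinEvent. A minimal sketch (run from an elevated prompt on the KDC):

```powershell
# Pull the 50 most recent Kerberos ticket events (4768 = TGT,
# 4769 = service ticket) from the Security log.
Get-WinEvent -FilterHashtable @{
    LogName = 'Security'
    Id      = 4768, 4769
} -MaxEvents 50 |
    Select-Object TimeCreated, Id |
    Format-Table -AutoSize
```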

New fields have been added to these events to capture all of the encryption algorithms supported by an account and to log the specific algorithm that was used during a ticket request. Using this information, you can now better identify:

  • Authentication client devices that only support RC4
  • Authentication target devices that only support RC4
  • Accounts that don’t have AES-SHA1 keys provisioned, specifically for AES128-CTS-HMAC-SHA1-96 (AES128-SHA96) and AES256-CTS-HMAC-SHA1-96 (AES256-SHA96)

The first important, new field is called msds-SupportedEncryptionTypes. This field specifies the encryption algorithms that an account supports and is provided for both the client machine and the target service in a request. By default, this field should include both AES-SHA1 and RC4. If it does not include AES-SHA1, that indicates an account that we would expect to use RC4, which would need to be remediated.

The next new field, Available Keys, provides information on the encryption keys that have been created for an account in Active Directory. For most accounts in Windows, this should include RC4 and AES-SHA1 already. If this field contains RC4 but not AES-SHA1, it indicates an account that is not ready to use AES-SHA1 and that would need to be addressed.

The last important new field is the Session Encryption Type. This field contains the encryption algorithm that was used for a specific Kerberos request. Most events will indicate AES-SHA1 was used because that is the default behavior for Windows devices and accounts today. Filtering this event for RC4 will help identify potential problematic accounts and configurations.
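If you want a quick look before setting up a full audit, you can filter these events for RC4 with a simple message-text match. The exact field layout in the message depends on the event template on your domain controllers, so treat this as a rough first pass rather than a substitute for the auditing scripts:

```powershell
# Surface ticket events whose message mentions RC4; a heuristic
# first pass, not a replacement for the auditing scripts.
Get-WinEvent -FilterHashtable @{ LogName = 'Security'; Id = 4768, 4769 } -MaxEvents 1000 |
    Where-Object { $_.Message -match 'RC4' } |
    Select-Object TimeCreated, Id |
    Format-Table -AutoSize
```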

New PowerShell scripts

Instead of manually reviewing the Security Event Log on your domain controllers to find problematic RC4 usage via events 4768 and 4769, you can use two new PowerShell scripts available on the Microsoft Kerberos-Crypto GitHub repository.

List-AccountKeys.ps1

Use this PowerShell script to query the Security Event Log for the new Available Keys field. The script enumerates the keys that are available for the accounts it finds from the event logs, as well as the following information:

  • The time at which an event happened
  • The account name
  • The account type
  • The account keys

PS C:\tools> .\List-AccountKeys.ps1

Time                  Name         Type Keys
----                  ----         ---- ----
1/21/2025 2:00:10 PM  LD1$      Machine {RC4, AES128-SHA96, AES256-SHA96, AES128-SHA256…}
1/21/2025 2:00:10 PM  AdminUser    User {RC4, AES128-SHA96, AES256-SHA96, AES128-SHA256…}
1/21/2025 6:50:34 PM  LD1$      Machine {RC4, AES128-SHA96, AES256-SHA96, AES128-SHA256…}
1/21/2025 6:50:34 PM  AdminUser    User {RC4, AES128-SHA96, AES256-SHA96, AES128-SHA256…}
1/21/2025 6:50:34 PM  LD1$      Machine {RC4, AES128-SHA96, AES256-SHA96, AES128-SHA256…}

In this case, the results show that there are AES128-SHA96 and AES256-SHA96 keys available for the accounts found in the logs, meaning these accounts will continue to work if RC4 is disabled.

Get-KerbEncryptionUsage.ps1

Use this PowerShell script to query the same events to see which encryption types Kerberos used within your environment. In this example, the requests used AES256-SHA96, which is a part of AES-SHA1.

PS C:\tools> .\Get-KerbEncryptionUsage.ps1

Time       : 1/21/2025 2:00:10 PM
Requestor  : ::1
Source     : AdminUser@CONTOSO.COM
Target     : LD1$
Type       : TGS
Ticket     : AES256-SHA96
SessionKey : AES256-SHA96

Time       : 1/21/2025 2:00:10 PM
Requestor  : 192.168.1.1
Source     : AdminUser
Target     : krbtgt
Type       : AS
Ticket     : AES256-SHA96
SessionKey : AES256-SHA96

With this script, you can try out additional filtering options on specific encryption algorithms. For example, use the RC4 filter to specifically find requests that used RC4:

PS C:\tools> .\Get-KerbEncryptionUsage.ps1 -Encryption RC4

You can also use security information and event management (SIEM) solutions, like Microsoft Sentinel, or built-in Windows event forwarding as described in So, you think you’re ready for enforcing AES for Kerberos? to query these logs.

Recommendations on RC4 usage scenarios

You’ve used the scripts and identified RC4 usage. Now what should you do?

Here are some common scenarios and recommended solutions. For deeper dives, see our official Detect and remediate RC4 usage in Kerberos documentation.

A user account only has RC4 keys

You used the List-AccountKeys.ps1 script and have identified a user or machine account that only has RC4 in the list of keys. To prepare this account to use AES-SHA1 instead of RC4, reset the account password. Resetting the password will automatically create AES128-SHA96 and AES256-SHA96 keys in Active Directory for the account.
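For example, a user account reset can be done from PowerShell (the account name below is a placeholder); for a machine account, resetting the computer’s secure channel password has the same effect:

```powershell
# Reset a user account password; Active Directory generates
# AES128-SHA96 and AES256-SHA96 keys as part of the reset.
# 'svc-legacy' is a placeholder account name.
Set-ADAccountPassword -Identity 'svc-legacy' -Reset `
    -NewPassword (Read-Host -AsSecureString -Prompt 'New password')

# Run on the affected machine to reset its machine account password.
Reset-ComputerMachinePassword -Server 'DC1.contoso.com'
```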

A user account doesn’t show support for AES-SHA1

You queried the Security log and found an account where the msds-SupportedEncryptionTypes field does not include the AES-SHA1 encryption types. There are multiple reasons why this may be the case and the most common scenarios are outlined below:

Scenario 1: The source or target account for a request might not have AES128-SHA96 and AES256-SHA96 correctly configured in its supported encryption types. If this is the case, here’s how you can view the policy:

  • You can use Active Directory Users and Computers (ADUC) with Advanced Features enabled (under View > Advanced Features). Find the account of interest in ADUC, right-click the account name, select Properties, and in the newly opened window select the Attribute Editor tab. In the list of attributes, find msDS-SupportedEncryptionTypes to confirm the configuration of the account. If needed, configure the account to include AES128-SHA96 and AES256-SHA96 using Group Policy.
  • You can also use PowerShell with the following Get-ADObject command. Note: The output for msds-SupportedEncryptionTypes will be in decimal format.

PS C:\> Get-ADObject -Filter "Name -eq 'LM1' -and (ObjectClass -eq 'Computer' -or ObjectClass -eq 'User')" -Properties "msds-SupportedEncryptionTypes"

DistinguishedName             : CN=LM1,CN=Computers,DC=contoso,DC=com
msds-SupportedEncryptionTypes : 28
Name                          : LM1
ObjectClass                   : computer
ObjectGUID                    : 3a4c6bc4-1a44-4f1f-b74a-02ec4a931947

To interpret the values and to determine the best configuration for your environment, check out Active Directory Hardening Series – Part 4 – Enforcing AES for Kerberos and Decrypting the Selection of Supported Kerberos Encryption Types.
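As a convenience, the decimal value can be decoded in PowerShell. The helper below is a hypothetical name (not part of the shipped scripts); the flag values are the documented encryption-type bits (DES-CBC-CRC = 0x1, DES-CBC-MD5 = 0x2, RC4-HMAC = 0x4, AES128-SHA96 = 0x8, AES256-SHA96 = 0x10):

```powershell
# Decode the msds-SupportedEncryptionTypes bitmask into names.
function ConvertFrom-EncryptionTypes {
    param([int]$Value)
    $flags = [ordered]@{
        'DES-CBC-CRC'  = 0x1
        'DES-CBC-MD5'  = 0x2
        'RC4-HMAC'     = 0x4
        'AES128-SHA96' = 0x8
        'AES256-SHA96' = 0x10
    }
    $flags.GetEnumerator() |
        Where-Object { $Value -band $_.Value } |
        ForEach-Object { $_.Key }
}

ConvertFrom-EncryptionTypes 28   # RC4-HMAC, AES128-SHA96, AES256-SHA96
```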

After setting the right combination for your environment, restart the device, and it will update its msds-SupportedEncryptionTypes attribute in the Active Directory database.

Scenario 2: The source or the target machine might not have the msds-SupportedEncryptionTypes defined in AD and is falling back to the default supported encryption types.

You’ll need a more holistic understanding of your environment. Do you know what happens to devices that don’t have a value defined for msds-SupportedEncryptionTypes, or whose value is set to 0? Normally, these devices will automatically receive the value of DefaultDomainSupportedEncTypes. Depending on your individual risk tolerance, consider using one of the following methods to address this scenario:

  • Define the specific msds-SupportedEncryptionTypes value in the account properties to ensure it isn’t falling back to the DefaultDomainSupportedEncTypes.
  • Set the DefaultDomainSupportedEncTypes to include AES128-SHA1 and AES256-SHA1. Note: This will change the behavior of all accounts that don’t have a value for msds-SupportedEncryptionTypes.
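For the first option, the attribute can be set directly with Set-ADObject. The distinguished name below is the example account from earlier, and 0x18 (decimal 24) enables AES128-SHA96 plus AES256-SHA96 with no RC4; whether to keep the RC4 bit (0x4) during a transition period is an environment-specific decision:

```powershell
# Pin the account to AES-SHA1 only, so it no longer inherits the
# domain default. 0x18 = AES128-SHA96 (0x8) + AES256-SHA96 (0x10).
Set-ADObject -Identity 'CN=LM1,CN=Computers,DC=contoso,DC=com' `
    -Replace @{ 'msDS-SupportedEncryptionTypes' = 0x18 }
```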

The device doesn’t support AES128-SHA96 or AES256-SHA96

The last version of Windows that did not support AES128-SHA96 and AES256-SHA96 was Windows Server 2003. We strongly recommend that you migrate to a supported version of Windows as soon as possible.

If you have a third-party device that does not support AES128-SHA1 and AES256-SHA1, we want to hear from you! Please reach out to stillneedrc4@microsoft.com telling us:

  • What is this device?
  • How does it fit into your workflow?
  • What is your timeline for upgrading this device?

Using WAC for configuring allowed encryption types

Microsoft provides a security baseline for Windows Server 2025 to set and audit recommended security configurations. This baseline includes disabling RC4 as an allowed encryption type for Kerberos. You can apply security baselines or view compliance using PowerShell or using the Windows Admin Center.

In Windows Admin Center, you can access the security baseline compliance report by connecting to a server you’ve configured using OSConfig and selecting the Security Baselines tab of the Security blade. There, you can filter for the policy “Network Security: Configure encryption types allowed for Kerberos” to see your current compliance state for allowed encryption types. The compliant values for this policy in the baseline that do not allow RC4 are:

  • 2147483624: AES128-SHA96 + Future Encryption types
  • 2147483632: AES256-SHA96 + Future Encryption types
  • 2147483640: AES128-SHA96 + AES256-SHA96 + Future Encryption types
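If you prefer to spot-check a single server without WAC, the same policy value can be read from the registry; if the policy has never been configured, the value is absent:

```powershell
# Read the "encryption types allowed for Kerberos" policy value and
# compare it against the compliant values listed above.
$key = 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System\Kerberos\Parameters'
(Get-ItemProperty -Path $key -Name SupportedEncryptionTypes -ErrorAction SilentlyContinue).SupportedEncryptionTypes
```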

This is an example of the audit report indicating a device with a compliant setting.

This is an example of the audit report showing devices configured with a setting that differs from the compliant values above.

Using stronger ciphers

In the current security landscape, RC4 isn’t required to ensure secure Windows authentication. You can use stronger ciphers, like AES-SHA1, for authentication across all supported versions of Windows. We hope that these detection and mitigation tools help you and your organization in your hardening efforts. Please check out the official Detect and remediate RC4 usage in Kerberos documentation for more details and scenarios.

Windows Server and SQL Server at Microsoft Ignite 2023
http://approjects.co.za/?big=en-us/windows-server/blog/2023/12/04/windows-server-and-sql-server-at-microsoft-ignite-2023/
Mon, 04 Dec 2023 17:00:00 +0000

The post Windows Server and SQL Server at Microsoft Ignite 2023 appeared first on Microsoft Windows Server Blog.

This year, Microsoft Ignite 2023 took place in Seattle, Washington from November 12 to 15, 2023 and it was such a wonderful experience to meet and interact with nearly 5,000 of you in person, and many more online across the globe. One common theme stood out throughout the event: the potential of AI is becoming reality, and it’s happening right now. One news roundup even called Microsoft’s vision for AI an “everyday reality.”1 Read more about the economic impact of AI in business and industries in a recent IDC study that Microsoft commissioned.

However, as many organizations like yours are eager to innovate with AI for various use cases, it is just as important, if not more so, to have a solid IT foundation that can support that ambitious AI vision—from a cost, performance, and security perspective. The last thing companies want is to make a big investment in AI and machine learning initiatives too soon, without the bandwidth, guardrails, or necessary performance in place to support it.

At the heart of your IT estate lie strategic investments in business-critical workloads like Windows Server and SQL Server, which have been the foundation of many organizations for more than 30 years. The question then becomes: how do you modernize these foundational technologies so that you are ready to leverage the full power of AI, adopt AI in a secure, responsible way, and gain an edge over the competition?

Catch up on sessions from Microsoft Ignite

Whether you missed some of the sessions at Ignite or just want a recap of all things Windows Server and SQL Server, we’ve got you covered. In this blog, we’re going to showcase the main Ignite sessions for Windows Server and SQL Server, where we had various announcements, demos, and customer testimonials.

Windows Server

What’s New in Windows Server v.Next: In this session, we provide a preview of what’s coming next for Windows Server, a platform that enables IT professionals and developers to modernize their applications and enable hybrid use cases. The topics covered were Active Directory, File Server, Storage, Hyper-V, Security, and more.

Do more with Windows Server and SQL Server on Azure: This session highlights how you can reap more technical and business benefits by running Windows Server and SQL Server on Microsoft Azure. You’ll learn how Azure provides optimal cost benefits, performance, and security for these workloads. Get tips and demos on how to extend Azure innovations to your hybrid and multi-cloud environments with Azure Arc.

Migrate to Innovate: Be AI-ready, secure, and optimize operations: This is an immersive session for IT practitioners on how to migrate to Azure. We highlighted practical steps, demos, and guidance on how migrating to Azure can accelerate the impact of AI in your organization. We then highlighted how you can enhance security and optimize operations once in the cloud and take the first step into Azure with Azure Migrate.

Learn Live: Upgrade and migrate Windows Server IaaS virtual machines: In this online session, you can learn to migrate a workload running in Windows Server to an infrastructure as a service (IaaS) virtual machine and to Windows Server 2022 by using Windows Server migration tools or the Storage Migration Service.

SQL Server

Get superior price and performance with Azure cloud-scale databases: In this session, you can learn how to improve performance with the latest capabilities for Azure SQL Database, Azure Database for PostgreSQL, and SQL Server enabled by Azure Arc for hybrid and multi-cloud. You’ll learn how customers enabled ongoing innovation by migrating to Azure Database for MySQL. This session covers tactical ways to get the most from your applications with databases that are easy to use, deliver unmatched price and performance, support open source, and enable transformative AI technologies.

Accelerate your SQL migration with Azure Data Migration Service: In this demo, you’ll see how the new Azure Data Migration Service, along with Azure Migrate, can accelerate your SQL modernization journey. We will showcase the service’s streamlined capabilities for readiness assessment, SKU recommendations based on workload rightsizing, and online and offline data migration across the Azure portal, Azure Data Studio, PowerShell, and command-line interface (CLI) experiences for your SQL Server migration journey to Azure from on-premises.

Migrate to innovate: Modernize your data on Azure SQL Managed Instance: In this session, you can watch new performance enhancements in action and experience the ease of online migration to Azure SQL Managed Instance using the link feature. See how you can continue to modernize on Azure through Microsoft Fabric integration and connections to other Azure services.

Bring enhanced manageability to SQL Server anywhere with Azure Arc: Join this discussion to discover how connecting your SQL Servers to Azure can enhance your management, security, and governance capabilities with live demos. SQL Server enabled by Azure Arc is a hybrid cloud solution that allows you to manage, secure, and govern your SQL Server estate running anywhere from Azure. Our experts will also explore different options for deploying Azure Arc to your SQL Servers at scale.

Next steps to modernize Windows Server and SQL Server

Ready to take the next step in modernizing your Windows Server and SQL Server? Here are some quick resources to get started:

  1. Upgrade to the latest versions of Windows Server to take advantage of the latest capabilities. Learn more about Windows Server 2022 and SQL Server 2022.
  2. Looking to migrate to Azure? Take the first step with Azure Migrate and Modernize, our offering that has programs, offers, support, free tooling, and expert guidance to confidently migrate to Azure.
  3. Join the discussion on our Windows Server Tech Community and SQL Server Tech Community.

1 ITProToday, Microsoft Ignite 2023 Envisions AI as an Everyday Reality, November 16, 2023.

Ten reasons you’ll love Windows Server 2016 #1: PowerShell and DSC
http://approjects.co.za/?big=en-us/windows-server/blog/2016/04/07/ten-reasons-youll-love-windows-server-2016-1-powershell-and-dsc/
Thu, 07 Apr 2016 16:00:12 +0000

The post Ten reasons you’ll love Windows Server 2016 #1: PowerShell and DSC appeared first on Microsoft Windows Server Blog.

Manage your servers “DevOps-style” and share code with the PowerShell community

In his “Ten Reasons you’ll love Windows Server 2016” video series, technical evangelist Matt McSpirit introduces you to some of the experts behind the most exciting new features in Windows Server 2016.

Today, you will meet Keith Bankston, Senior Program Manager on the PowerShell team.  Keith talks about what’s changed for PowerShell in Windows Server 2016.  He speaks to the enhancements in Desired State Configuration (DSC) that aid the transition into DevOps and that set and keep servers configured properly. In addition, this talk introduces the PowerShell Gallery and the PowerShell GitHub repository, where people can find code to help them get started with PowerShell and DSC.

Learn about these and other new PowerShell features in Windows Server 2016.

Keith would like to personally thank all the users who have taken the time to post their PowerShell resources on PowerShellGallery.com.

Want more PowerShell resources?  You can engage with Keith and our other PowerShell experts at the PowerShell pages at http://microsoft.com/PowerShell, and follow us on the PowerShell blog at https://blogs.msdn.microsoft.com/powershell/.

Check out the other posts in this series.

Getting Started with PowerShell 3.0 Jump Start
http://approjects.co.za/?big=en-us/windows-server/blog/2013/07/03/getting-started-with-powershell-3-0-jump-start/
Wed, 03 Jul 2013 09:00:00 +0000

The post Getting Started with PowerShell 3.0 Jump Start appeared first on Microsoft Windows Server Blog.

Don’t miss this opportunity; get your staff together and learn about PowerShell right from the source! Join Jeffrey Snover, the inventor of PowerShell, together with Jason Helmick, Senior Technologist at Concentrated Technology, as they take you through the ins and outs of using PowerShell for real-time problem solving and automation. This will be a high-speed, fun day aimed at IT pros, admins, and help desk staff who want to know how to use this powerful management tool to improve management capabilities, automate redundant tasks, and manage environments at scale. It will also prepare you for a second event on August 1, which will go further into scripting, automation, and building tools (cmdlets).

Getting Started with PowerShell 3.0 Jump Start | Register Now

Date: July 18, 2013
Time: 9:00am – 5:00pm
Where: Live, online virtual classroom
Cost: Free!


Final Stops for the Windows Server 2012 Community Roadshow: Australia, Serbia, and Thailand!
http://approjects.co.za/?big=en-us/windows-server/blog/2013/02/07/final-stops-for-the-windows-server-2012-community-roadshow-australia-serbia-and-thailand/
Thu, 07 Feb 2013 16:10:00 +0000

The post Final Stops for the Windows Server 2012 Community Roadshow: Australia, Serbia, and Thailand! appeared first on Microsoft Windows Server Blog.

Hi, this is Christa Anderson, Community Lead for the Windows Server and System Center Group. As we wrap up the Windows Server 2012 Community roadshow this month, I’d like to thank our many MVPs who supported it. They’re great speakers and very knowledgeable, and through their continued efforts this roadshow reached every continent but Antarctica. I am very proud to work with this group, and as a former MVP I am proud to have been among their numbers.

Get on the wait list for an event in Belgrade, Serbia on February 11

Get on the wait list for an event in Perth, Australia on February 11

Get on the wait list for an event in Brisbane, Australia on February 12

Register now for an event in Bangkok, Thailand on February 27

If you’re in the neighborhood of one of these events, I urge you to register if you can or get on the wait list if you’re close to one of the sold-out sessions. You’ll be glad you did!

Thanks,

Christa Anderson

Windows Server 2012, PowerShell 3.0 and DevOps, Part 2
http://approjects.co.za/?big=en-us/windows-server/blog/2012/05/30/windows-server-2012-powershell-3-0-and-devops-part-2/
Wed, 30 May 2012 12:09:00 +0000

The post Windows Server 2012, PowerShell 3.0 and DevOps, Part 2 appeared first on Microsoft Windows Server Blog.

This concludes my two-part series.  In my first post, I provided some background information about PowerShell and DevOps.  In this post, I’ll provide a bunch of specifics.  PowerShell 3.0, like Windows Server 2012, has a ton of new features and enhancements, so I’ll only scratch the surface.

While PowerShell has always been focused on the goals of DevOps, PowerShell 3.0 and Windows Server 2012 take this to a new level.  With Windows Server 2012, we shifted our focus from being a great OS for a server to being a cloud OS for lots of servers and the devices that connect them, whether they are physical or virtual, on-premises or off-premises.  In order to achieve this, we needed major investments in:

  1. Automating everything
  2. Robust and agile automation
  3. Making it easier for operators to automate
  4. Making it easier for developers to build tools

Automating Everything
Windows Server 2008/R2 shipped with ~230 cmdlets.  Windows Server 2012 beats that by a factor of more than 10, shipping ~2,430 cmdlets.  You can now automate almost every aspect of the server.  There are cmdlets for networking, storage, clustering, RDS, DHCP, DNS, file servers, print, SMI-S, etc. – the list goes on.  If you’ve read blogs about Windows Server 2012, you’ve seen how many things can be done using PowerShell.  If you haven’t kept up to date, check out Jose Barreto’s File Server blog posts, Yigal Edery’s Private Cloud blog posts, Ben Armstrong’s Virtual PC Guy blog posts, the Clustering and High-Availability blog posts, or Natalia Mackevicius’ Partner and Customer blog posts and you’ll see what I mean.  Windows Server 2012 is, by far, the most automatable version of Windows ever.
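You can check the coverage on a given installation yourself; the exact count varies with the roles, features, and modules installed:

```powershell
# Count the cmdlets and functions discoverable in installed modules.
(Get-Command -CommandType Cmdlet, Function).Count

# Or inspect a single area, e.g. the Storage module's cmdlets.
Get-Command -Module Storage | Measure-Object
```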

There are already a large number of hardware and software partners shipping PowerShell cmdlets, and those that haven’t released them yet are working to quickly deliver them in the next versions of their products.  This was very clear at the recent MMS conference in Las Vegas, and I think you’ll see even more support at TechEd.  You should definitely make sure that any product you buy delivers a full set of PowerShell cmdlets.  If it doesn’t, you should think twice and do some due diligence to make sure you are getting a product that is current and is still being invested in.  If they didn’t do PowerShell, what other things are they missing?  The good news is that a lot of products will support PowerShell by the time Windows Server 2012 ships, and the products that have delivered cmdlets found it easy to do and mention the very positive customer feedback they get.  EVERY product that ships PowerShell cmdlets increases its investment in PowerShell in its next release.

Robust and agile automation

Workflow
We integrated the Windows Workflow Foundation engine into PowerShell to make it simple and easy to automate things that take a long time, that operate at very large scale, or that require the coordination of multiple steps across multiple machines.  Traditionally, Windows Workflow has been a developer-only tool requiring Visual Studio and a lot of code to create a solution.  We’ve made it an in-box solution that operations staff can use to easily create solutions with their existing PowerShell scripting skills.  Workflow provides direct support for parallel execution, operation retries, and the ability to suspend and resume operations.  For example, a workflow can detect a problem that requires manual intervention, notify the operator of this condition, and then suspend operations until the operator corrects the situation and resumes the workflow.

Operators can use any of the available Workflow designers to create workflows.  However, we took it a step further and simplified authoring by extending the PowerShell language with the workflow keyword.  Any operator or developer can now easily author a workflow using the tools that ship in all Windows SKUs.  The behavior of a workflow is different from that of a function, and it has a few more rules, but if you know how to write a PowerShell function, you are 80% of the way to being able to write a workflow.  Authoring workflows using PowerShell is much easier than working with XAML, and many of us find it easier to understand than the Workflow designer tools.  You also get the benefit of being able to paste a workflow into email and have someone read and review it without installing special tools.  Below is an example workflow which operates on multiple machines, collecting inventory information in parallel on each of them.

The command below will get this inventory information from a list of servers contained in servers.txt and output the results to a file.  If any of the servers is unavailable, the workflow will attempt to contact the server every 60 seconds for an hour.
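The original post showed this code as screenshots, which are not reproduced here. A sketch of what such a workflow and its invocation could look like (the workflow name and file names are illustrative):

```powershell
# A workflow that gathers basic inventory; the parallel block runs
# its activities concurrently on each target machine.
workflow Get-Inventory {
    parallel {
        Get-CimInstance -ClassName Win32_OperatingSystem
        Get-CimInstance -ClassName Win32_ComputerSystem
    }
}

# Fan out to every server listed in servers.txt, retrying unreachable
# machines every 60 seconds for up to an hour (60 retries).
Get-Inventory -PSComputerName (Get-Content .\servers.txt) `
    -PSConnectionRetryCount 60 -PSConnectionRetryIntervalSec 60 |
    Out-File .\inventory.txt
```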

Workflow is exactly what DevOps practitioners need to reliably and repeatably perform operations.  One of the key techniques of DevOps is A/B testing where two versions of software are deployed and run for a period of time.  They are measured against some goodness metric (e.g. increased sales) and then the winning version is deployed to all machines.  The workflow capabilities allow PowerShell to perform operations against a large number of machines over a large period of time making it easy to automate A/B testing.

Scheduled jobs
We also seamlessly integrated Task Scheduler and PowerShell jobs to make it simple and easy to automate operations that either occur on a regular schedule or in response to an event occurring.  Below is a workflow which is meant to run forever.  It collects configuration information (disk info) and then suspends itself.  The workflow is started and given a well-known name “CONFIG”.  We’ll resume this workflow using Task Scheduler.  In the example, we register a ScheduledJob to run every Friday at 6pm and after every system startup.  When one of the triggers occurs, the scheduled job runs and resumes the workflow using its well-known name.  The workflow then collects the configuration information, putting it into a new file, and suspends itself again.
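The code for this example was also shown as a screenshot; a sketch of the pattern described above, using the well-known job name from the example ("CONFIG") and otherwise illustrative names and paths:

```powershell
# A workflow that collects disk configuration and then suspends
# itself until it is resumed again.
workflow Invoke-ConfigCollection {
    while ($true) {
        InlineScript {
            Get-CimInstance -ClassName Win32_LogicalDisk |
                Out-File "C:\config\disks.txt"
        }
        Suspend-Workflow
    }
}

# Start the workflow under the well-known job name, then register a
# scheduled job that resumes it every Friday at 6 PM and at startup.
Invoke-ConfigCollection -AsJob -JobName CONFIG
$triggers = @(
    New-JobTrigger -Weekly -DaysOfWeek Friday -At '6:00 PM'
    New-JobTrigger -AtStartup
)
Register-ScheduledJob -Name ResumeConfig -Trigger $triggers `
    -ScriptBlock { Resume-Job -Name CONFIG }
```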

Robust Networking
In previous releases, PowerShell shipped with remoting disabled by default and required operators to go to each machine and issue the Enable-PSRemoting cmdlet in order to remotely manage it.  As a Cloud OS, remote management of servers via PowerShell is now the mainstream scenario, so we’ve reduced the steps required and enabled PowerShell remoting by default in all server configurations.  We did extensive security analysis and testing to ensure that this was safe.

In Wojtek Kozaczynski’s blog post on Standards-Based management, he described how we made WS-MAN our primary management protocol and kept COM and DCOM for backwards compatibility.  WS-MAN is a Web-Services protocol that uses HTTP and HTTPS.  While these are effectively REST protocols, PowerShell establishes a session layer on top of them to reuse a remote process for performance and to take advantage of session state.  These sessions were robust in the face of modest network interruptions but would occasionally break when operators managed servers from their laptops over Wi-Fi networks while roaming between buildings.  We’ve enhanced the session layer of WS-MAN; by default, it will now survive network interruptions of up to 3 minutes.  Disconnected Sessions support was added to PowerShell sessions, which gives users the option to disconnect from an active remote session and later reconnect to the same session, without losing state or being forced to terminate task execution.  You can even connect to the session from a different computer (just like a remote desktop session).
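A quick sketch of the disconnect/reconnect flow (Server01 is a placeholder name):

```powershell
# Create a remote session; any state you build up lives in it.
$s = New-PSSession -ComputerName Server01
Invoke-Command -Session $s -ScriptBlock { $inventory = Get-Service }

# Disconnect and walk away; the session (and $inventory) stays alive
# on the server.
Disconnect-PSSession -Session $s

# Later, from the same or a different computer, pick it back up:
Get-PSSession -ComputerName Server01 | Connect-PSSession
```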

Easier for operators to automate
We wanted to significantly lower the skill level required to successfully automate a complex solution.  Ultimately we want to create a world where operators think about what they want, type it and get it.  Every customer’s needs and scenarios are different, so they need to script their own solutions.  Our goal is to make it simple and easy to author scripts gluing together high-level, task-oriented abstractions.  The number one factor in making it simple is cmdlet coverage.  That is why having ~2,430 cmdlets makes Windows Server 2012 so much easier to automate.  A number of these cmdlets are extremely effective in dealing with the messy, real-world life of datacenters.  We have cmdlets to work with REST APIs and JSON objects, and even to get, parse and post web pages from management applications if required.
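For instance, the web and JSON cmdlets mentioned above can be combined like this (the URLs are hypothetical):

```powershell
# Call a REST API; the JSON response is converted into objects for you:
$build = Invoke-RestMethod -Uri 'http://build.contoso.com/api/latest'
$build.status

# Or parse raw JSON text directly:
'{"name":"web01","role":"frontend"}' | ConvertFrom-Json

# Fetch a page from a management application and inspect its links:
$page = Invoke-WebRequest -Uri 'http://status.contoso.com'
$page.Links | Select-Object -First 5 href
```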

PowerShell 3.0 simplifies the language and utility cmdlets to reduce the steps and syntax necessary to perform an operation.  Below is an example showing the old way of doing something and the new simplified syntax.
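The original side-by-side screenshot did not survive extraction; a representative example of the simplified Where-Object/ForEach-Object syntax added in PowerShell 3.0:

```powershell
# The old way: a script block with $_
Get-Process | Where-Object { $_.WorkingSet -gt 100MB }

# The PowerShell 3.0 simplified syntax: no braces, no $_
Get-Process | Where-Object WorkingSet -gt 100MB

# The same simplification applies to ForEach-Object:
Get-Process | ForEach-Object Name
```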

PowerShell 3.0 improves the authoring tools operators use to create scripts and author workflows.  PowerShell ISE now supports rich IntelliSense, snippets, third-party extensibility and a Show-Command window which makes it easy to find exactly the right command and parameters you need to accomplish a task.

Easier for developers to build tools
Developers have always loved scripting with PowerShell because of its power, its use of C language conventions and its ability to program against .NET objects.  PowerShell 3.0 cleans up a number of seams in dealing with .NET and objects, and expands to allow developers to use PowerShell in a much wider range of scenarios.

Tool building enhancements
PowerShell 3.0 now has an Abstract Syntax Tree (AST).  This allows new classes of intelligent tools to create, analyze, and manipulate PowerShell scripts.  One of the Microsoft cloud services depends upon a very large number of PowerShell scripts to run all aspects of the service.  Their development team used the AST to develop a script analysis tool to enforce a set of scripting best practices for their operators.  The public AST is the reason why IntelliSense is freakishly powerful.  It uses the AST to reason about the actual behavior of the program.
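As a small illustration of what the public AST enables, the parser can be invoked directly and the resulting tree queried; this sketch lists every command invoked by a script:

```powershell
# Parse a script into tokens, parse errors, and an AST:
$tokens = $null; $errors = $null
$ast = [System.Management.Automation.Language.Parser]::ParseInput(
    'Get-Process | Where-Object Handles -gt 500',
    [ref] $tokens, [ref] $errors)

# Walk the tree and report every command name it contains:
$ast.FindAll({ $args[0] -is [System.Management.Automation.Language.CommandAst] }, $true) |
    ForEach-Object { $_.GetCommandName() }
```

A script-analysis tool like the one described above is essentially this pattern applied with a larger set of predicates.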

We modified a number of key areas of PowerShell to make them easier for developers to use and extend when writing their own tools.  This includes access to our serializer, API improvements, and an extensibility model for PowerShell ISE.

Scripting enhancements
PowerShell 3.0 now uses the .NET Dynamic Language Runtime (DLR) technology.  PowerShell monitors how a script is executing and will compile the script or portions of the script on the fly to optimize performance.  Performance varies but some scripts run 6 times faster in 3.0.

IntelliSense (and tab completion on the command line) now works with .NET namespaces and types.  It is able to reason about the program and uses variable type inference to improve the quality of its suggestions.

We extended our hashtable construct with two variations which make it much easier for developers to get the behavior they want:
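The two variations are the [ordered] and [pscustomobject] type accelerators introduced in PowerShell 3.0:

```powershell
# [ordered] preserves the order in which keys were added:
$steps = [ordered]@{ download = 'done'; install = 'pending'; verify = 'pending' }

# [pscustomobject] turns a hashtable into a proper object in one step:
$server = [pscustomobject]@{ Name = 'web01'; Role = 'frontend'; Port = 8080 }
$server.Name    # property access, not key lookup
```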

Platform building enhancements
We have streamlined the process of supporting delegated administration scenarios.  PowerShell 3.0 allows you to register a remoting endpoint, configure which commands it makes available, and specify which credentials those commands should run as.  This allows you to let regular users run a well-defined set of cmdlets using Admin privileges.  We’ve simplified the process of defining which cmdlets are available by using a declarative session configuration file.
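A sketch of such a constrained, delegated endpoint; the endpoint name, file path, account, and cmdlet list are all illustrative:

```powershell
# Declare a restricted endpoint that exposes only two cmdlets:
New-PSSessionConfigurationFile -Path .\Maintenance.pssc `
    -SessionType RestrictedRemoteServer `
    -VisibleCmdlets 'Get-Service', 'Restart-Service'

# Register it so the exposed commands run under an admin account:
Register-PSSessionConfiguration -Name Maintenance `
    -Path .\Maintenance.pssc `
    -RunAsCredential (Get-Credential CONTOSO\SvcAdmin)

# A regular user then connects to just that endpoint:
Enter-PSSession -ComputerName Server01 -ConfigurationName Maintenance
```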

PowerShell 3.0 is also available as an optional component of Windows PE (WinPE).

Windows Server 2012 and PowerShell 3.0 are excellent DevOps tools
DevOps is a new term and there is some disagreement about what it entails but at the heart it is all about making change safe through automation and bridging the gap between operators and developers.  There is a lot to do in this area but Windows Server 2012 and PowerShell 3.0 make excellent progress towards accomplishing those goals.  PowerShell won’t be the only tool in your DevOps toolbox but it should be in every DevOps toolbox.  Download the beta today and find out for yourself.

The post Windows Server 2012, PowerShell 3.0 and DevOps, Part 2 appeared first on Microsoft Windows Server Blog.

]]>
http://approjects.co.za/?big=en-us/windows-server/blog/2012/05/30/windows-server-2012-powershell-3-0-and-devops-part-2/feed/ 9
Windows Server 2012, PowerShell 3.0 and DevOps, Part 1… http://approjects.co.za/?big=en-us/windows-server/blog/2012/05/29/windows-server-2012-powershell-3-0-and-devops-part-1/ http://approjects.co.za/?big=en-us/windows-server/blog/2012/05/29/windows-server-2012-powershell-3-0-and-devops-part-1/#comments Tue, 29 May 2012 09:55:00 +0000 In the first of a two part series, I provide some background information about PowerShell and DevOps.  In the second post, I’ll provide you a bunch of specifics.  PowerShell 3.

The post Windows Server 2012, PowerShell 3.0 and DevOps, Part 1… appeared first on Microsoft Windows Server Blog.

]]>
In the first of a two-part series, I provide some background information about PowerShell and DevOps.  In the second post, I’ll provide you with a bunch of specifics.  PowerShell 3.0, like Windows Server 2012, has a ton of new features and enhancements, so I’ll only scratch the surface.

The first time I heard the term DevOps was in a podcast describing the 2009 Velocity conference.  While most of the industry was struggling to deploy releases a few times a year, John Allspaw and Paul Hammond rocked the house with the talk “10 Deploys Per Day: Dev And Ops Cooperation at Flickr”.  They made the case for delivering business results through changes in culture and tools, and gave birth to a new term: DevOps.  The problem is that developers think they are responsible for delivering features while operators are responsible for keeping the site running.  The gap between developers and operators leads to finger-pointing when things go wrong.  A successful business requires an IT culture of joint accountability and mutual respect: developers thinking about the needs and concerns of operators, and operators thinking about the needs and concerns of developers.

Their talk described how businesses required rapid change but that change is the root cause of most site-down events. Shunning the traditional “avoid change” approach, they advocated minimizing risk by making change safe through automation.  This is the job of DevOps – safe change.  This was the Taguchi quality approach applied to IT operations.  Taguchi observed that the root cause of poor quality was variation.  The solution was to first figure out how to do something repeatably.  Once you could do that, then you can make small modifications in the process to see whether they make things better or worse.  Back out the changes that make things worse. Keep doing the things that make things better.  The key is repeatability.  Repeatability allows experimentation which drives improvement.  We get repeatability in IT operations through automation.

We envisioned a distributed automation engine with a scripting language which would be used by beginner operators and sophisticated developers.   PowerShell’s design was driven by the same thinking and values that drove the birth of DevOps:

  1. Focus on the business
  2. Make change safe through automation
  3. Bridge the gap between developers and operators

Focus on the business
PowerShell has always focused on people using computers in a business context.  PowerShell needed to be consistent, safe, and productive.  Much has been made of the similarities between PowerShell and UNIX but in this regard, our ties are much closer to VMS/DCL and AS400/CL.

Consistent:  Operators and developers don’t have a lot of time to learn new things.  A consistent experience lets them invest once in a set of skills and then use those skills over and over again.  PowerShell uses a single common parser for all commands and performs common parameter validation, delivering absolute consistency in command-line syntax.  PowerShell cmdlets are designed in a way that ubiquitous parameters can provide consistent functions to all commands (e.g. -ErrorAction, -ErrorVariable, -OutVariable, etc.)

Safe:  An operator once told me that occasionally he was about to do something and realized that if he got it wrong, he would be fired.  In PowerShell, if you ever execute a cmdlet which has a side effect on the system, you can always add -WhatIf to test what would happen if you went through with the operation.  We also support -Confirm, -Verbose and -Debug.  Despite these safeguards, things can go wrong, and when they do, PowerShell spends a lot of effort to speed up the process of diagnosing and resolving the error.
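A quick illustration of these safety parameters (the service and path here are just examples):

```powershell
# -WhatIf describes the action without performing it; the service
# is left running and a "What if:" message is printed instead:
Stop-Service -Name Spooler -WhatIf

# -Confirm prompts for approval before each destructive operation:
Remove-Item -Path C:\Temp\* -Recurse -Confirm
```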

Productive:  Every aspect of PowerShell’s design maximizes the power of users (ergo the name).  PowerShell makes it easy to perform bulk operations across a large number of machines.  PowerShell also makes it easy to have productive engagements between your operators and developers because it allows them to speak a common language and to help each other with their scripts.

Make change safe through automation
There has been a lot of discussion about whether PowerShell is a .NET language, a scripting language, or an interactive shell.  PowerShell is a distributed automation engine with a scripting language and interactive shell(s).   Interactive shells and a scripting language are critical components, but the focus has always been on automation through scripting.  Automation is the process of reducing and/or eliminating operations performed by a human.  A script documents what is going to happen.  People can review a script, and you can modify it based upon their feedback.  You can test the script, observe the outcome, and modify it; if the modification is good, keep it, and if it is bad, back it out.  In other words, scripting provides the repeatability required to apply the Taguchi method to IT operations.  Once you have an automated process, you can safely apply it over and over again.  These processes can now be performed reliably by lower-skilled admins.  These steps aren’t possible when you use traditional GUI admin tools.

Bridge the gap between developers and operators
Our goal has always been to deliver a single tool which could span the needs of operators doing ad hoc operations, simple scripting, formal scripting, advanced scripting and developers doing systems-level programming.
PowerShell spends a ton of effort trying to project the world in terms of high-level, task-oriented abstractions with uniform syntax and semantics.  We call these cmdlets, and they are what operators want in order to manage systems efficiently and effectively.  To copy a file using APIs, you would do this:
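The original code sample was an image and did not survive extraction; a representative contrast between calling an API directly and using the task-oriented cmdlet (the file paths are illustrative):

```powershell
# Copying a file through the .NET API:
[System.IO.File]::Copy('C:\Data\report.txt', 'D:\Backup\report.txt', $true)

# The same operation as a task-oriented cmdlet:
Copy-Item -Path C:\Data\report.txt -Destination D:\Backup\report.txt -Force
```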

Have you ever wondered why PowerShell uses curly braces {} (and other C constructs) instead of BEGIN/END as other scripting languages do?  We did that because we wanted to make PowerShell easier to adopt for developers coming from other C-based programming languages: C++, Objective-C, Java, JavaScript, Perl, PHP, etc.  We did some testing and determined that operators were able to readily adapt to this syntax.  We also wanted to provide a smooth glide path between PowerShell and C#.  This provides career mobility for operators who might want to transition to being a developer.

Most importantly, we wanted to develop a tool which could be used by BOTH operators and developers to bridge the gap between the groups and allow them to create common scripts, learn from each other and work together.

Windows Server 2012 and PowerShell 3.0 are excellent DevOps tools
DevOps is a new term and there is some disagreement about what it entails but at the heart it is all about making change safe through automation and bridging the gap between operators and developers.  There is a lot to do in this area but Windows Server 2012 and PowerShell 3.0 make excellent progress towards accomplishing those goals.  PowerShell won’t be the only tool in your DevOps toolbox but it should be in every DevOps toolbox.  Download the beta today and find out for yourself.

The post Windows Server 2012, PowerShell 3.0 and DevOps, Part 1… appeared first on Microsoft Windows Server Blog.

]]>
http://approjects.co.za/?big=en-us/windows-server/blog/2012/05/29/windows-server-2012-powershell-3-0-and-devops-part-1/feed/ 11
Announcing Windows Server 2008 R2 Beta! http://approjects.co.za/?big=en-us/windows-server/blog/2009/01/07/announcing-windows-server-2008-r2-beta/ http://approjects.co.za/?big=en-us/windows-server/blog/2009/01/07/announcing-windows-server-2008-r2-beta/#comments Wed, 07 Jan 2009 21:20:00 +0000 In the life of Windows Server, today marks general availability of public beta for the new Windows Server 2008 R2-and, for me personally, I’ve never had my geeky mitts on a better version.

The post Announcing Windows Server 2008 R2 Beta! appeared first on Microsoft Windows Server Blog.

]]>
In the life of Windows Server, today marks general availability of the public beta for the new Windows Server 2008 R2, and, for me personally, I’ve never had my geeky mitts on a better version. The new release incorporates a host of new features and capabilities that I hope you’ll check out; the code is as stable a beta as I’ve ever seen, and combined with the beta of Windows 7, you’ll be able to evaluate not just a bevy of new server-side capabilities but a new level of synergy between server and client operating systems, too.

A quick recap of my favorite highlights:

  • While the Windows 7 client is available in both x86 and x64 versions, Windows Server 2008 R2 is Microsoft’s first 64-bit-only OS. It also supports up to 256 logical processors, which opens up a whole new world of enterprise-class back-end processing power.
  • Your existing servers will run faster, too, because Windows Server 2008 R2 takes advantage of the latest CPU architecture enhancements. You’ll also get significant power management improvements via features like Core Parking.
  • Hyper-V in R2 now has Live Migration, allowing IT admins to move VMs across physical hosts with no interruption of service or network connectivity, along with significant network performance improvements. VMs in Hyper-V for R2 also get greater access to physical resources, namely support for 32 logical processors. It all adds up to the most flexible virtual data center in Microsoft’s history.
  • Check out PowerShell 2.0. Next to Live Migration, “more PowerShell” is the most consistent customer request we’ve had from Windows Server 2008. So, you’ll find over 240 new cmdlets out of the box, along with new dev tools for building your own cmdlets that are not only more robust but easier, too. The new PowerShell is so powerful that we’re starting to build GUI-based management consoles that are based entirely on PowerShell in the background. Check out the new Active Directory Administrative Center for starters.
  • RDS is another big-time update. What used to be called Terminal Services has now evolved into Remote Desktop Services with the R2 release. Key in RDS is the new Virtual Desktop Infrastructure (VDI), which allows you to centralize Windows desktops in the data center as virtual machines, in addition to the traditional session-based remote desktop model we all know and love from Terminal Services. But VDI is only one new feature in RDS. Others include better end-user fidelity with features like true multiple monitor support and high-end audio and video, so you’ve got more breadth in the kinds of applications you can centralize. And the new RemoteApp and Desktop connections feature integrates tightly enough with Windows 7 that users of the new desktop OS will, in practice, barely need to differentiate between what’s local and what isn’t. It all runs off the Start menu.
  • And speaking of Windows 7…Windows Server 2008 R2 is a powerful upgrade to any Windows Server data center all by itself. But in combination with Windows 7 on the client side you’ll enter a whole new world of manageability and productivity:
  • DirectAccess makes remote access ubiquitous (I’m nuts about this one),
  • BranchCache can improve content retrieval at branch offices while simultaneously decreasing WAN bandwidth costs,
  • New Group Policy objects allow deeper control of client desktop management, including access, system monitoring and even physical resources like power management,
  • You’ll be able to manage and keep data safe even on removable drives by using BitLocker to Go.

And those are just my favorite four. This list hardly encompasses all that Windows Server 2008 R2 has to offer. Check out the full kit for yourself at the Windows Server 2008 R2 Web site. And as always, we’re looking for feedback so keep those comments coming.

The post Announcing Windows Server 2008 R2 Beta! appeared first on Microsoft Windows Server Blog.

]]>
http://approjects.co.za/?big=en-us/windows-server/blog/2009/01/07/announcing-windows-server-2008-r2-beta/feed/ 15
GUI config utility for server core http://approjects.co.za/?big=en-us/windows-server/blog/2008/03/31/gui-config-utility-for-server-core/ http://approjects.co.za/?big=en-us/windows-server/blog/2008/03/31/gui-config-utility-for-server-core/#comments Mon, 31 Mar 2008 15:56:00 +0000 Do you want to use the server core installation option of Windows Server 2008, but are put off by the command-line interface? Well, Guy Teverovsky may have what you need. He created a GUI for the GUI-less server core of WS08.

The post GUI config utility for server core appeared first on Microsoft Windows Server Blog.

]]>
Do you want to use the server core installation option of Windows Server 2008, but are put off by the command-line interface? Well, Guy Teverovsky may have what you need. He created a GUI for the GUI-less server core of WS08.

Patrick

The post GUI config utility for server core appeared first on Microsoft Windows Server Blog.

]]>
http://approjects.co.za/?big=en-us/windows-server/blog/2008/03/31/gui-config-utility-for-server-core/feed/ 2
MSD evolution and leadership http://approjects.co.za/?big=en-us/windows-server/blog/2007/07/25/msd-evolution-and-leadership/ http://approjects.co.za/?big=en-us/windows-server/blog/2007/07/25/msd-evolution-and-leadership/#comments Wed, 25 Jul 2007 17:26:00 +0000 Hi, my name is Brad Anderson and I’m the General Manager, leading product development and engineering for Microsoft’s Management and Solutions Division. Many of you have heard that I recently assumed leadership for MSD after Kirill Tatarinov moved over to lead MS Business Solutions Division.

The post MSD evolution and leadership appeared first on Microsoft Windows Server Blog.

]]>
Hi, my name is Brad Anderson and I’m the General Manager, leading product development and engineering for Microsoft’s Management and Solutions Division. Many of you have heard that I recently assumed leadership for MSD after Kirill Tatarinov moved over to lead MS Business Solutions Division. Looking back at my four years with Kirill, I feel confident that Microsoft customers, from small business up to the enterprise, have much better managed IT environments today. During the last 4 years, we rolled out the Dynamic Systems Initiative, delivered market-leading products like SMS and MOM as well as instrumentation technologies such as Windows PowerShell, and evolved into the world of System Center with the recently released Operations Manager and, coming this Fall, Configuration Manager, Virtual Machine Manager, and a new release of Data Protection Manager (Service Manager is planned for release next year). Also, our annual Microsoft Management Summit has grown into a major industry event with thousands of customers and partners.
 
But today, I want to reach out to Windows Server customers and partners to share my enthusiasm about System Center management tools, and how these tools lead to better managed and secured IT environments for customers adopting Windows Server 2008, SQL Server 2008 and Visual Studio 2008.
 
With today’s increasingly complex IT environment and ever-changing threat landscape, there’s a clear need for solutions that allow you to more easily secure and manage your infrastructure. To address this need and simplify the task of securing and managing IT, security solutions and systems management solutions must integrate more seamlessly. That’s why in May we outlined our vision for integrated security and management solutions and announced key product milestones under the Forefront and System Center brands. This is a key area of focus for our group moving forward. Today, Forefront and System Center products are optimized for Active Directory and other Windows technologies to enable policy configuration and enforcement. In the future, Forefront and System Center solutions will be unified through a common service management solution enabling workflow definition, process automation and comprehensive reporting across security and management. By doing so, security specialists and management specialists can standardize their processes for gathering and prioritizing incidents, assigning resources to address issues, managing changes, and resolving problems.
 
Two weeks ago we announced that the 2008s [Windows Server, SQL Server, Visual Studio] would launch in February next year, with availability staged throughout the year. We’re in the process of building the best management tools so that developers can build knowledge into their code, and it’ll automatically execute based on that knowledge and policies. Visual Studio 2005 Team System introduced developers to SDM, and we’re working with the Visual Studio team to further connect software developers with architects and IT admins via Visual Studio 2008.
 
For Windows Server 2008, new management packs/agents for MOM 2005 and SC Operations Manager 2007 will be available in H1. Along those lines, we’ll also have a release of System Center Configuration Manager (formerly SMS) and a second release of Virtual Machine Manager to manage virtualized workloads enabled with Windows Server Virtualization. With VMM, System Center can manage both physical and virtual assets. We developed this technology as, increasingly, customers told us they want a single, unified solution for managing both. I’ve met plenty of customers with physical servers in the datacenter operating at only 15% CPU capacity. SC Virtual Machine Manager assesses and then consolidates suitable server workloads onto virtual machine host infrastructure; this frees up physical resources for repurposing or hardware retirement.
 
Our software design goals will continue to focus on our dynamic IT vision.  With the release of SC Operations Manager earlier this year, and with broader industry partnerships, we’ve begun delivering on that vision.

The products we are releasing and those in the pipeline, together with the collaboration with our industry partners, will yield tremendous value for our customers in terms of management efficiency and agility. To be sure, my four years at Microsoft have seen some significant innovations in our software products, instrumentation and strategy. I’m excited to lead the teams that will create this software.
 
Brad Anderson
General Manager
MSD
 

The post MSD evolution and leadership appeared first on Microsoft Windows Server Blog.

]]>
http://approjects.co.za/?big=en-us/windows-server/blog/2007/07/25/msd-evolution-and-leadership/feed/ 6