The Wayback Machine - https://web.archive.org/web/20201101021739/https://github.com/PowerShell/PowerShell/issues/6745

Is it time for "PowerShell vZeroTechnicalDebt" and/or for an opt-in mechanism into new/fixed features that break backward compatibility? #6745

Open
mklement0 opened this issue Apr 26, 2018 · 69 comments

@mklement0 (Contributor) commented Apr 26, 2018

Note that the term technical debt is used loosely here to mean "accumulated broken behavior that can't be fixed without breaking backward-compatibility"; strictly speaking the term has a different, specific meaning.


Update: @rjmholt has officially started a discussion about how to implement and manage breaking changes: #13129.


Note: This issue, which arose out of #5551 (comment), is just to get the discussion started to see if there's a fundamental willingness to entertain such changes. Eventually, RFC(s) are needed.

PowerShell's steadfast commitment to backward compatibility has served the community very well over the years.

On the flip side, the inevitable by-product was the accumulation of technical debt that requires memorizing exceptions and is a barrier to newcomers.

Certain fundamental problems that exist today can only be solved at the expense of backward compatibility, and there are two - not mutually exclusive - ways to handle that:

  • Implementation of a "PowerShell vZeroTechnicalDebt" edition that sheds all technical debt, but forfeits backward compatibility (possibly with future versions using semantic versioning to indicate compatibility)

  • Integration of backward-compatibility-breaking features / fixes into the existing code base, available strictly on an opt-in basis only.

Which approaches, if any, are we willing to consider?
Once we have clarity on that, we can flesh out the processes.

Here are some of the existing fundamental problems that can only be solved at the expense of backward compatibility; I'm sure others will think of more:

  • The complexity and inconsistency of current error handling and problematic [lack of] integration with "native" (external) programs - #3996

  • Inconsistent, hard-to-predict preference-variable / common-parameter inheritance - #4568

  • Performance issues due to [object[]] being the fundamental collection type - #5643 (comment)

    • While PowerShell understandably will never match the speed of Unix utilities, it is all the more important to make interop with them predictable and as painless as possible - see remaining issues re quoting and parsing.
  • Problematic dynamic features that should work lexically: #3879 (comment) re break and continue, https://github.com/PowerShell/PowerShell-RFC/blob/master/1-Draft/RFC0003-Lexical-Strict-Mode.md re Set-StrictMode (though the latter may be fixed without breaking compatibility)

  • [psobject]-related problems (though perhaps they can be fixed without breaking compatibility): #5551, #5579, #4343, #5763

  • The unfortunate -LiteralPath / -Path split - brief rationale at #6714 (comment) and escaping woes at #6714 (comment) (not sure there's a good solution)

  • Broken quoting for external programs - #3734, #5576 (and others) and the related languishing RFC

  • -Command and -File CLI argument parsing - #4024 (comment), #3223 - and general misalignment with the CLI of POSIX-like shells - #3743 - including user profiles getting loaded by default even in non-interactive (script) invocations - #992

  • Inconsistent and surprising parsing of compound command-line arguments - #6467 - and non-parameter tokens that look like parameters - #6291, #6292, and #6360

  • Broken handling of ValueFromRemainingArguments parameters, as argued in #2035 (comment), with broken behavior shown in #5955 and #5122 - going back to #2038

Environment data

Written as of:

PowerShell Core v6.0.2
@BurtHarris commented Apr 26, 2018

Interesting. I've always wondered whether the 1 in .ps1 could be used to add a sort of semantic versioning to PowerShell scripts, allowing a script to indicate if it was adapted to possible breaking changes.

@Jaykul commented Apr 26, 2018

There are expensive-to-fix problems which I think of as the technical debt of PowerShell not adopting new .NET concepts as they came along:

  • Language syntax for type parameters for generics methods #5146

  • Engine support for Extension Methods to automatically surface in the type system (like they do in compiled .Net languages). #2226

  • Language support for async APIs #6716 (and RFC)

There are a few regrettable features:

  • Could we do something based on type hints (like @KirkMunro's FormatPx) instead of cmdlets that output formatting objects? #4594 #4237 #3886

  • Would you normalize the Registry provider to be content-based, instead of feeling like a demo of how a property provider could work? #5444

  • Would you refactor PSProviders deeply to make them easier to write? It's a shame that it takes two layers of abstraction to arrive at "SHiPS" and finally give people a way to write providers that they're willing to use.

How about a bigger question:

Would we be willing to reconsider the "many ways is better" approach and work on a "pit of success" approach? I mean, would we be willing to remove features that are considered "the easy way" but which are fundamentally worse, in favor of having only "the right way" to do things? E.g.:

  • Make CmdletBinding always on
  • Make @() notation mean a List[PSObject]
  • Make @{} mean a Dictionary[PSObject, PSObject] (or Dictionary[string, PSObject] 😲)
  • Make Process the default block
  • Maybe even make ValueFromPipelineByPropertyName the default ;-)
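For reference, a quick check of the literal types these proposals would change (runnable as-is today):

```powershell
# Current behavior: array and hashtable literals produce these types.
@().GetType().FullName    # System.Object[]
@{}.GetType().FullName    # System.Collections.Hashtable
```

Under the proposal above, these would become List[PSObject] and Dictionary instead.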
@mklement0 (Contributor, Author) commented Apr 27, 2018

All great suggestions, @Jaykul.

Re making @() notation mean a List[PSObject]: I think we can take this further and use an efficiently extensible collection type as PowerShell's fundamental collection type, so that it is used wherever [object[]] currently is (and is used internally as well as when returning collections, with no type-conversion performance penalty); there are challenges around + use, but @PetSerAl has suggested a custom list implementation to deal with that.
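A minimal sketch of the performance problem being discussed; List[object] here merely stands in for whatever efficiently extensible type would be chosen:

```powershell
# [object[]] is fixed-size, so += allocates a new array and copies on every
# append - O(n^2) work overall for n appends.
$arr = @()
foreach ($i in 1..1000) { $arr += $i }

# An efficiently extensible collection appends in amortized O(1):
$list = [System.Collections.Generic.List[object]]::new()
foreach ($i in 1..1000) { $list.Add($i) }
```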

Re making Process {} the default block in functions, as an aside: it's what Filter currently does, but it is limited to the - implied - Process block, and this function variant seemingly never really caught on; unifying the two in the context of Function makes sense to me.
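To illustrate the filter/function equivalence mentioned above (the function names are illustrative):

```powershell
filter Double-It { $_ * 2 }                  # 'filter': body is an implied process block
function Double-It2 { process { $_ * 2 } }   # equivalent explicit form

1..3 | Double-It    # 2, 4, 6
```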

@KirkMunro (Contributor) commented Apr 27, 2018

Funny: I'm on the Azure PowerShell community standup call right now, and they're going to move from AzureRm to a new Az module that is cross-platform (no separate AzureRm.Core), with all command prefixes changed from AzureRm to Az. The two will be side by side for a while, but they're moving to the new command set.

Too bad PowerShell hasn't had an opportunity yet to create a new executable that would run side-by-side with Windows PowerShell, but that could break away from some of the legacy crud that drags it down.

@bergmeister (Contributor) commented Apr 29, 2018

Just imagine what would happen should we go for it:
Realistically speaking, it would take at least 6 months of focused effort to come up with a 'no regrets' version and another 6 months to iterate on it. Then 3rd-party modules need to adapt to it as well in order for this version to be useful, which will take at least another 6 months to get reasonable coverage (and bear in mind that some modules never will)... Then account for delay and unexpected problems, etc. So, no, I think it is only wishful thinking that one can get rid of all technical debt in one version in one go (and still develop, not just maintain, the old version).

As much as I wish that such a version existed, I think it will only be possible to get to it slowly, one breaking change at a time. With v6 a lot of breaking changes were already accepted, but I think if one includes too many breaking changes in one version, it will become too complex to upgrade existing scripts/modules. It's good to discuss the most valuable breaking changes, but until there is an LTS version of pwsh, I do not think it is time to think about having a 2nd train of pwsh with more substantial changes in parallel to the existing mainstream version.

@BrucePay (Collaborator) commented Apr 30, 2018

@bergmeister Agreed. However, even relatively small changes on a core path can seriously hinder adoption. Look at Python 3: it took 10 years to really catch on. With much bigger changes, who knows how long it will take for Perl 6 to be dominant (and it took them 15 years to come up with their right stuff, so 1.5 years for PowerShell++ seems optimistic :-)). On the other hand, PHP seems to break things on a regular basis, possibly due to how and what it's used for.

@BurtHarris commented Apr 30, 2018

Python 3 is certainly the horror show here: has it really caught on yet? I'm still running 2.7, and don't plan on upgrading any time soon. But I haven't heard much about Perl 6 recently either...

I think the lesson to learn from Python there is to separate breaking changes to the language from a version change to the engine. Hypothetically, a PS 7 engine could still run earlier scripts (.PS1 files) in a no-breaking-changes mode, while scripts marked as 7-aware (say, with a .PS7 extension) could declare that they have been updated and require at least PS 7 to run. Hope that makes sense.

@BurtHarris commented Apr 30, 2018

Perhaps the best success story is JavaScript/TypeScript/Babel. A transpiler (with source-map support) seems like the way-to-go for language evolution.

@BrucePay (Collaborator) commented Apr 30, 2018

JavaScript is a special case. You're pretty much stuck with it, so transpiling is really the only option. TypeScript is "just" JavaScript with extensions, so it's easy for people to adopt. Any JavaScript program is a TypeScript program, so you start with what you have and just add annotations from there. Dart, on the other hand, is its own language but transpiles to either JavaScript or a native runtime in Chrome (at least that was the plan at one point). Dart doesn't seem to have picked up much adoption outside of Google, likely because it is its own language.

@BurtHarris

Python 3 is certainly the horror show, has it really caught on yet?

I was reading an article last week where the author was claiming critical mass had been achieved for Python 3: that all the core modules were available and now people were migrating in droves. We shall see.

Interesting. I've always wondered whether the 1 in .ps1 could be used to add a sort of semantic versioning to PowerShell scripts, allowing a script to indicate if it was adapted to possible breaking changes.

WRT .ps1, we reserved the right to change the extension in case we got the language completely wrong resulting in catastrophic changes to the runtime such that scripts for the previous version just wouldn't work. But changing the extension also leads to a huge pile of work because so many things in the ecosystem are tied to an extension. So it's not something to do lightly. And of course, being part of Windows, if we forked the extension, we'd still have to maintain both versions (kind of like Python 2/3).

@mklement0 (Contributor, Author) commented Apr 30, 2018

@bergmeister:

Valid concerns, but in the spirit of:

It's good to discuss the most valuable breaking changes

please share any that you may have in mind.

slowly one breaking change at a time

While a (lexically scoped) opt-in mechanism for incompatible changes is a solution, I'm concerned about two things:

  • Piecemeal introduction can lead to no one being able to remember which version is required for what feature; Perl (v5-) comes to mind.

  • The code base becoming bloated and hard to maintain, and the resulting binary being equally bloated, which hurts (at least) startup performance.

I've said it before: to me - and this is just a hunch - the v6 GA was an unfortunate compromise between making old-timers unhappy with breaking changes while carrying forward enough baggage to hinder adoption in the Unix[-like] world.

That said, given PowerShell's relative youth in the Unix[-like] world, perhaps there is (still) more of a willingness to work out problems by way of incompatible changes. As @BrucePay states, Windows PowerShell will have to be maintained anyway, and it can remain the safe haven for backward compatibility.

@alx9r commented May 4, 2018

I think the scale of the negative consequences of such breaking changes has already been covered, and I agree with most of those points. I am writing this because I am doubtful that, in the greater context, these changes would yield significant benefits in the first place. At least for the way I use PowerShell.

The OP includes the following statement:

On the flip side, the inevitable by-product was the accumulation of technical debt that requires memorizing exceptions and is a barrier to newcomers.

The implied premise of this statement seems to be that making these various breaking changes would alleviate the burden of memorizing exceptions. Indeed that would be a great outcome. However, I am skeptical that that would be the result. PowerShell's behavior is deliberately rich. This makes it both expressive and unpredictable. Expressiveness and predictability seem to work against one another, at least amongst the languages I am familiar with.

While I agree that many of the breaking changes mentioned above would improve predictability somewhat in some cases, many unpredictable and surprising aspects of the language will remain. Despite spending years writing PowerShell, I am still frequently surprised by PowerShell behavior that seems to be by design and probably shouldn't be changed. Some recent examples that come to mind are as follows:

  • conditional delayed binding of [scriptblock] arguments (#6419)
  • when, exactly, implicit enumeration occurs (#5832)
  • conversion of [AutomationNull]::Value to $null during parameter binding (#6357 is related)
  • the impact of break and continue on flow of control in the pipeline (#5811)
  • what happens when you splat $null (SO)
  • how using scriptblocks across session states affects $_ (SO 1, SO 2)

I expect that these examples are just a small fraction of the many other carefully-designed but surprising nuances I have not yet discovered. I selected these examples because

  1. they represent behavior that probably can't be improved upon in PowerShell, and
  2. I have no hope of reliably spotting all of their implications when reading or writing PowerShell.

I'm fine with this. The overwhelming majority of surprising PowerShell behavior does not have lasting impact on my success with PowerShell. Surprising behavior is almost always caught immediately by testing during development. I learn from the things that slip through and use that to improve my testing strategy. This is true whether those surprising behaviors are the kind that could be eliminated or the kind that must remain.

Whether or not the proposed breaking changes above are made, PowerShell will never become so predictable that I can significantly reduce test coverage. In other words, for the way I use PowerShell, I don't think there's much of an upside even possible from making the breaking changes proposed above.

(BTW, thank you @mklement0 for directly asking this question. This has been in the back of my mind for a while. It's good to see the opportunity for everyone to say their piece.)

@mklement0 (Contributor, Author) commented May 6, 2018

Thanks, @alx9r.

I think it's important to distinguish between intrinsic complexity and extrinsic complexity:

  • Intrinsic complexity stems from the inherent complexity of the concepts being implemented, which in the case of PowerShell has two primary sources: internal complexity from introducing a new paradigm (object-based pipeline) and marrying two distinct worlds (shell syntax and programming-language syntax), and external complexity from interfacing with multiple, disparate outside worlds.

  • Extrinsic complexity stems from leaky abstractions and inconsistencies.

    • Such complexity should be eliminated.

    • If that is not an option due to backward-compatibility concerns, such complexity should be documented as known problems.

    • The script-module variable scoping behavior (which you reference in the context of $_) is more than just a quirk: it is the root cause of a major problem, the previously mentioned #4568.

    • All other issues you mention strike me as falling into the extrinsic category (rather than being the result of careful design), because they all present as inconsistencies that have no obvious (documented) rationale or benefit:

      • It makes more sense for splatting with $null to pass no arguments rather than a positional $null argument.
      • Why would [System.Management.Automation.Internal.AutomationNull]::Value be converted to $null during parameter binding, even though the type is preserved in direct variable assignment? See #9150 (comment)
      • What benefit is there to the dynamic scoping of break and continue, across the call stack, with quiet termination if no enclosing loop is found?
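A small demonstration of the dynamic scoping of break described in the last point (a sketch; Stop-Here is a made-up name):

```powershell
function Stop-Here { break }   # note: no enclosing loop in this function

$out = foreach ($i in 1..5) {
    if ($i -eq 3) { Stop-Here }   # break travels up the call stack to the caller's loop
    $i
}
$out -join ' '   # '1 2' - the foreach was terminated from inside the function
```

With no enclosing loop anywhere on the call stack, the script itself quietly terminates instead.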

While the wealth of features and the joining of disparate worlds alone makes it hard to remember all requisite intrinsic complexity, minimizing the extrinsic one is still important.

Having to test your intended approach first without just knowing and trusting that it will work - or to have things break unexpectedly due to surprising behavior - is a serious productivity (and enjoyment) hindrance (even though PowerShell commendably makes it very easy to interactively test behavior).

It comes down to solid concepts (that don't contravene intuitive expectations), descriptive naming, and good documentation:

If I don't know something, or am not sure I remember it correctly, I need to know where to look it up, and have faith that known problems and edge cases are also documented (either as part of the regular help topics or via links from there).

So even if we decide that eliminating extrinsic complexity is not an option, we can at least document it systematically - and the discussion here can serve as the starting point for compiling a "pitfall gallery" (which may also include cases of unavoidable intrinsic complexity that may be surprising).

@HumanEquivalentUnit (Contributor) commented Aug 6, 2018

we can at least document it systematically - and the discussion here can serve as the starting point for compiling a "pitfall gallery"

Roman Kuzmin has been collecting such a gallery for a while, here: https://github.com/nightroman/PowerShellTraps

@mklement0 (Contributor, Author) commented Aug 6, 2018

Thanks, @HumanEquivalentUnit - that looks like a great collection.

Along with the issues collected in this thread, it could form the basis for a gallery that is part of the official documentation, with each entry augmented with, as appropriate:

  • a design rationale that explains why the behavior, though perhaps surprising, is justified after all (a properly framed explanation may make the element of surprise go away)

  • an acknowledgement of the problem, stating that:

    • either: it won't be fixed, in the interest of backward compatibility,
    • or: a future change is being considered,
    • or: the issue is already fixed in PS Core.
@schittli commented Sep 19, 2018

It's strange that it even has to be asked whether it's time for "PowerShell vZeroTechnicalDebt".

Exactly for this purpose there is the #Requires -Version mechanism.

If something breaks in a new version, the only pain we get is that we have to study and learn the usually small differences.

But we (finally) get a platform that really gets better with each core iteration, and that is much easier to maintain.

No, it should never be in question whether to retire bad designs and decisions in favor of new know-how and technology.

@oising (Contributor) commented Oct 28, 2018

Also, to flog the living s**t out of a dead horse: it's still stupendously hard to write a well-behaved function or cmdlet that deals with the magical differences between the host's native filesystem and the FileSystem provider. Such nuanced code is difficult to write and often gotten wrong. Here's a post on Stack Overflow I answered about seven or eight years ago that's still valid:

https://stackoverflow.com/questions/8505294/how-do-i-deal-with-paths-when-writing-a-powershell-cmdlet

@vexx32 (Collaborator) commented Dec 20, 2018

Ref: #8495

There are some strange operator precedence rules that deserve revisions:

PS> 1, 2, 2 + 1, 4, 4 + 1
1
2
2
1
4
4
1

In most languages, one would more commonly expect this to be the output:

1
2
3
4
5
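Since the comma operator binds more tightly than +, the expression above parses as (1,2,2) + (1,4,4) + 1, i.e. array concatenation; parentheses recover the arithmetic reading:

```powershell
(1, 2, 2 + 1, 4, 4 + 1) -join ' '       # '1 2 2 1 4 4 1' - today's behavior
(1, 2, (2 + 1), 4, (4 + 1)) -join ' '   # '1 2 3 4 5'     - the commonly expected result
```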
@SteveL-MSFT (Member) commented Dec 23, 2018

One option that has its own pros and cons is a new feature flag to enable all of the "ideal" behavior so that it's opt-in. If we had some telemetry indicating that most of the usage has moved to the new behavior, we could flip it so it's opt-out. This might all just be a pipe dream as I haven't seen any real world case where such a model worked...

@vexx32 (Collaborator) commented Dec 23, 2018

Indeed, that would be nice, but given the rather... thorough... nature of some of the revisions in this thread, it risks completely splitting script compatibility between "normal" and "experimental", and potentially into several pieces, depending on which flags an individual has enabled.

This means we can't really rely on anything that is an experimental feature in that way, nor write scripts and modules that rely on them, unless we attach a huge warning to the docs pages, or potentially prevent them from being imported unless certain flags are enabled.

This might be avoidable, however... if we can, at will, enable and disable specific experimental features on a per-each-module-scope basis. But, given that that only further complicates matters, I'm not even sure if that's a particularly great solution, either.

@SteveL-MSFT (Member) commented Dec 23, 2018

@vexx32 just to be clear, I wasn't suggesting an experimental flag which would eventually become non-experimental. I was thinking something different (perhaps more like Enable-PSFeature as it would be officially supported (and thus protected from breaking changes unlike experimental features)). I was also thinking it would be a single flag where you opt into these new breaking changes as a set rather than individually.

@vexx32 (Collaborator) commented Dec 23, 2018

Oh, in that case... Yeah, absolutely. That would let us neatly package everything together under a single umbrella, and actually use it. Much better than what I was thinking! 😄

@mklement0 (Contributor, Author) commented Dec 24, 2018

@jszabo98's addition to the list of issues compiled here, based on #8512:

-and and -or unexpectedly have the same precedence, whereas in most languages (including C#) -and has higher precedence than -or.

Example expression that contravenes expectations:

PS> $true -or $true -and $false
False

Due to left-associativity and -and and -or having the same precedence, the above is evaluated as ($true -or $true) -and $false rather than the expected $true -or ($true -and $false).
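This is easy to verify, and parentheses restore the conventional grouping:

```powershell
$true -or $true -and $false     # False: parsed as ($true -or $true) -and $false
$true -or ($true -and $false)   # True:  the grouping most languages apply by default
```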

@iSazonov (Collaborator) commented Dec 24, 2018

This might all just be a pipe dream as I haven't seen any real world case where such a model worked...

In fact, we already have a model that works all over the world - LTS. MSFT uses it to develop Windows 10, and the model is also used to develop Unix distributions.
For us, this means that we could release LTS PowerShell Core versions every two years and include them in Windows 10 LTS versions and Unix LTS versions. (One problem is syncing Windows and Unix release dates for LTS versions. I guess MSFT can negotiate with the main Unix companies.)
During these two years, we collect minor breaking changes, deferring more significant breaking changes to the next cycle.
For each new LTS version, ScriptAnalyzer should get a set of new rules to facilitate script migration.
With this model, critical products and scripts only have to jump from one version to the next after thorough testing and migration.

I used this approach when I migrated an old system step by step (version by version) from PHP 4 to each next PHP version until I reached a supported PHP 5.x version. At each step, I had to make relatively small and quick changes, after which the system continued to work for some time until the next step.

Update: This should be in sync with .NET Core LTS versions. I expect that .NET Core 3.1 will be the next LTS, and PowerShell Core 6.x should be on that version.

@jzabroski commented Apr 16, 2019

@iSazonov Can you please edit and quote what you are replying to, and add some context, such as a hyperlink to documentation of said APIs? TYVM

@iSazonov (Collaborator) commented Apr 17, 2019

@jzabroski Look at the files in the src\System.Management.Automation\namespaces\ folder.

@Jaykul commented Apr 17, 2019

POWERSHELL does this differently

It's bash that's doing it differently - return can only set the (integer) process exit code. That's not what "return" means in other programming languages; that's an exit.

PowerShell can do that too, with exit 5

It's just that we don't use exit codes very often in PowerShell (pretty much only for scheduled tasks and interop with other OSes), because it isn't a process-based shell, and functions should not exit.

At the end of the day, it's the same reason why detecting the stdout handle type isn't useful in PowerShell (and why "redirecting" to out-null isn't the same as casting to [void] or assigning to $null)
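For anyone comparing the three ways of discarding output mentioned here (a sketch; in Windows PowerShell the Out-Null form historically ran the full pipeline and was therefore the slowest, though PowerShell Core later optimized common cases):

```powershell
$null = Get-ChildItem     # assignment: output discarded at expression level
[void](Get-ChildItem)     # [void] cast: same effect, expression level
Get-ChildItem | Out-Null  # pipeline: objects are produced, then dropped
```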

@Jaykul commented Apr 17, 2019

I recommend people file new issues if they actually have feedback they want acted on -- since the team has clearly and unequivocally ruled out a not-backward-compatible version of PowerShell, I don't know why this thread just keeps going...

@jtmoree-github-com commented Apr 17, 2019

since the team has clearly and unequivocally ruled out a not-backward-compatible version of

That is news to me. How is this clear? Please provide links to posts or other documentation.

It's a bit frustrating for some of us to hear "we'll never break compatibility" while we wrestle with some breakage every day. The point of this thread is so that the free market of ideas can solve some of these problems when upstream won't.

Someone might find it advantageous enough to fork powershell and create a version with minimal technical debt. That group will benefit from the ideas presented in this thread. (This is how the world works now. May the best powershell win.)

I'll reiterate that the team has already produced a 'not-backward-compatible' version of PowerShell by renaming the command from powershell to pwsh. Power(SHELL) is a shell. The job of a shell is to be the glue for humans that ties digital systems together; it's not a compiled binary with minimal external dependencies. Even traditional programming languages plan for and make breaking changes.

@jtmoree-github-com commented Apr 17, 2019

POWERSHELL does this differently

It's bash that's doing it differently - return can only set the (integer) process exit code (that's not

I'm curious about other shells. What do they do? korn, csh, etc.

Here is an article discussing the return statement in multiple languages: https://en.wikipedia.org/wiki/Return_statement

It calls out that operating system [shells] allow for multiple things to be returned: a return code and output.

@chriskuech commented Jun 7, 2019

My team has a variety of scripts that only run in PowerShell 5, even though we use PowerShell 6 as much as possible. In my experience, the premise that PowerShell is completely backward-compatible is definitely false. There are at least some extreme cases (e.g. $ErrorActionPreference not behaving intuitively) that should definitely be addressed as breaking changes - cases where making the fix would be less "breaking" than not doing so.

@SteveL-MSFT (Member) commented Jul 5, 2019

@chriskuech is there an issue detailing your issue with ErrorActionPreference?

@jzabroski commented Jul 5, 2019

@SteveL-MSFT I believe the Mosaic that @KirkMunro linked to directly below @chriskuech's comments is the issue you are looking for. And yes, I did just squeeze the word Mosaic into a tech conversation.

That said, @iSazonov closed @chriskuech's original issue on October 1, 2018: see #7774

It seems this stuff keeps coming up, in different forms, and the Committee keeps closing issues around it.

@chriskuech commented Jul 6, 2019

@jzabroski found my main issue targeting the root cause.

I also filed an issue in the past around one of the symptoms: Invoke-WebRequest throwing a terminating error. I have personally witnessed multiple people completely dumbfounded by the try/catch boilerplate required for handling failed HTTP requests, which is needed because the internal .NET methods throw terminating errors. In each of three different cases, the engineer responded with an expletive when I explained the underlying behavior and why the issue would allegedly never be fixed.

To summarize: terminating errors terminate the script because PowerShell's creators believe the script cannot logically proceed beyond the error, but that is only ever the scripter's decision to make, on a case-by-case basis. Only the scripter can decide whether they want the script to Stop, SilentlyContinue, Continue, etc.
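The boilerplate being described looks roughly like this (the URL is illustrative; in Windows PowerShell a failed request's response is only reachable via the exception):

```powershell
try {
    $response = Invoke-WebRequest -Uri 'https://example.com/might-be-404'
    $status = [int]$response.StatusCode
}
catch {
    # A non-success status code surfaces as a *terminating* error, so even an
    # expected 404 has to be fished out of the exception object:
    $status = [int]$_.Exception.Response.StatusCode
}
```

PowerShell 7 later added -SkipHttpErrorCheck to Invoke-WebRequest, which sidesteps this for callers who opt in.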

@chriskuech commented Jul 6, 2019

(Tangential to the issue above)

If PowerShell implements an opt-in "ZeroTechDebt" mode, I definitely think that $ErrorActionPreference = "Stop" should be set by default. Obviously, this setting does not make sense in a REPL and should therefore not be set by default for all scenarios, but literally all of my scripts are prefixed with $ErrorActionPreference = "Stop" to enable "defensive programming" and behave like a "normal" programming language.
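A sketch of the difference in question (the path is illustrative):

```powershell
# Default ('Continue'): the error is non-terminating and execution proceeds.
$ErrorActionPreference = 'Continue'
Get-Item '/no/such/file' -ErrorVariable err 2>$null
"still running ($($err.Count) error(s) recorded)"

# 'Stop' promotes the same error to a terminating one - and makes it catchable.
$ErrorActionPreference = 'Stop'
try { Get-Item '/no/such/file' } catch { 'terminated, and caught' }
```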

@KirkMunro (Contributor) commented Jul 6, 2019

@chriskuech If you haven't already, please give this collection of RFCs a look: PowerShell/PowerShell-RFC#187. They speak directly to what you're talking about here without needing a new zero tech debt version of PowerShell.

You can also find the four RFCs in separate issues on the Issues page in that repo if that makes them easier to read/digest. Just look for open issues posted by me and you'll find them.

@jzabroski commented Jul 9, 2019

@SteveL-MSFT Here is a similar issue that impedes my productivity. It's not $ErrorActionPreference but $ConfirmPreference:

Below is an ugly script I wrote for setting SQL Server disk volumes to 64kb.

Import-Module Storage;

function Format-Drives
{
    # See https://stackoverflow.com/a/42621174/1040437 (Formatting a disk using PowerShell without prompting for confirmation)
    $currentconfirm = $ConfirmPreference
    $ConfirmPreference = 'none'

    Get-Disk | Where isOffline | Set-Disk -isOffline $false
    # The next line of this script is (almost) copy-pasted verbatim from: https://blogs.technet.microsoft.com/heyscriptingguy/2013/05/29/use-powershell-to-initialize-raw-disks-and-to-partition-and-format-volumes/
    Get-Disk | Where partitionstyle -eq 'raw' | Initialize-Disk -PartitionStyle MBR -Confirm:$false -PassThru | New-Partition -AssignDriveLetter -UseMaximumSize -IsActive | Format-Volume -FileSystem NTFS -AllocationUnitSize 64kb -Confirm:$false

    # See https://stackoverflow.com/a/42621174/1040437 (Formatting a disk using PowerShell without prompting for confirmation)
    $ConfirmPreference = $currentconfirm
}

Format-Drives

Couple of side points:

  1. The documentation for Format-Volume leaves you wanting more. Only two examples? And the official documentation is not in sync with the website: MicrosoftDocs/windows-powershell-docs#1170
  2. Why do I need to go to Stack Overflow to learn how to avoid a GUI prompt in a scripting language?
  3. The fact that this is so error-prone/stupidly hard to get right "without thinking" just underlines related issues like PowerShell/PowerShell-RFC#198 - it's another example of how Bruce Payette's promise in "PowerShell in Action" that you just type what you think is right and it works... is completely and utterly false.
  4. See also @mklement0's issue: #4568
@jzabroski commented Jul 9, 2019

#Requires -Version has no way to specify a max version
See: https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_requires?view=powershell-6

This is annoying when you are writing Advanced Functions that load .NET Framework libraries whose APIs are completely different between .NET Framework and .NET Core, such as the AccessControl APIs.
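Since #Requires -Version only sets a floor, a ceiling currently has to be enforced by hand; a hypothetical runtime guard (the message text is illustrative):

```powershell
# Warn when running on PowerShell Core/7+, where .NET Framework-only APIs
# (e.g. parts of AccessControl) may behave differently or be missing.
if ($PSVersionTable.PSVersion.Major -gt 5) {
    Write-Warning 'This script was written for Windows PowerShell 5.1 and relies on .NET Framework-only APIs.'
}
```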

@mburszley commented Aug 23, 2019

@jzabroski You can specify the edition, however, to separate that:

#requires -PSEdition Desktop
# versus
#requires -PSEdition Core
@mklement0 (Contributor, Author) commented Jul 9, 2020

Just a quick note that @rjmholt has officially started a discussion about how to implement and manage breaking changes: #13129.

Plus, #6817 and #10967 are more behaviors worth revisiting once breaking changes are allowed.
(They have the same root cause, explained in #10967 (comment)).

@yecril71pl (Contributor) commented Jul 23, 2020

The fact that , is stronger than + is logical, as PowerShell is more about lists than about numbers. IMHO.

@rjmholt (Member) commented Jul 23, 2020

@rjmholt has officially started a discussion

I should say it's no more official than any other discussion

@cveld commented Oct 21, 2020

I would love to have strict checking of imports and function signatures at edit time.
Are you considering introducing new import semantics such as those EcmaScript modules provide?
No more global namespace pollution or slow import mechanics.
