proj-oot-ootCommandlineNotes1

---

" The shell file management features. This includes redirection and pipelines. This is trickier. Much of this can be done with subprocess. But some things that are easy in the shell are unpleasant in Python. Specifically stuff like (a

"
b; c )something >result. This runs two processes in parallel (with output of a as input to b), followed by a third process. The output from that sequence is run in parallel with something and the output is collected into a file named result. That's just complex to express in any other language.
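A concrete, runnable bash illustration of that shape, with stand-in commands (a, b, c, and something above are placeholders):

    ( seq 5 | tac; echo done ) | grep -v 3 > result
    cat result   # 5 4 2 1 done -- both stages' output, filtered, collected in the file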

---

" In UNIX, you’d have something like:

for f in *; do echo "Processing $f file..."; done

While in PowerShell, you’d go with something similar to:

Get-ChildItem? "."

Foreach-Object { $name = $_.Name Write-Output "Processing $($name) file..." }

An equivalent functionality in Python can be achieved with:

from os import listdir

for f in listdir('.'): print('Processing {} file...'.format(f)) "

" $> cat names.txt

... To sort our list numerically, in descending order all we have to is $> cat names.txt ... if you’re using PowerShell?, cat is an alias for Get-Content and sort is an alias for Sort-Object, so the commands above can be also written as: $> cat names.txt "
namecount.py. And PowerShell? folks: $> Get-Content names.txt python namecount.py.
namecount.py sort -rn. And if you’re using PowerShell?: $> Get-Content names.txt python namecount.py Sort-Object { [int]$_.split()[-1] } -Descending
python namecount.py and $> Get-Content names.txt python namecount.py sort { [int]$_.split()[-1] } -Descending

"First off, the provided Unix commands _don't work_ (should have used `sort -rn -k 2`), while the provided powershell, as verbose as it is, does work."

" the entire python script along with the wrapping unix commands is encapsulated with a powershell 1-liner:

    cat ./names.txt | group | sort -d Count"

---

Ultimatt 1 day ago [-]

Skip awk, use perl....

The alias below sets perl to loop over STDIN, splitting each line on more than one whitespace character and populating the array @F. The -nE will then Evaluate an expression from the command line, looping over the input line-by-line.

    alias glorp='perl -aF"/\s+/" -nE'

So now we have the command `glorp` to play with which has more familiar syntax than awk and all of CPAN available to play with!

    $ [data is generated] | glorp '/Something/ and say $F[2]'

We have access to any Perl module by putting -MModule::Name=function after the command, the following will parse a JSON record per line and glorp out what we wanted:

    $ echo -e '{"hello":"world"}\n{"hello":"cat"}' | glorp 'say decode_json($_)->{hello};' -MJSON=decode_json
    world
    cat

Maybe you are used to using curl too. There is a nice web framework in Perl called Mojolicious (http://mojolicious.org) that provides a convenience module called 'ojo' for command line use. So grabbing the summary sentence from Wikipedia articles is as straight forward as below. Notice Mojolicious lets us use CSS selectors!

    $ echo -e 'grep\nawk\nperl' \
      | glorp 'say g("wikipedia.org/wiki/$F[0]")->dom->at("#mw-content-text > div > p")->all_text' -Mojo

reply

vidarh 1 day ago [-]

Here's the equivalent for Ruby:

    alias glorp='ruby -ane '
    $ [data is generated] | glorp '$_ =~ /Something/ and puts $F[2]'

Or:

    $ echo -e '{"hello":"world"}\n{"hello":"cat"}' | glorp 'puts JSON.load($_)["hello"] ' -rjson

(Of course Ruby got the -a autosplit-mode and the -n assumed 'while gets(); ... end' loop from Perl along with $_ and $F, so it's very intentional that they're similar)

reply

kolodny 1 day ago [-]

Somewhat related nodejs self plug: Use nip https://github.com/kolodny/nip

    $  echo -e 'this\nis\na\nwhatever foo' | nip 'return /whatever/.test(line) && cols[1]' # foo

reply

Ultimatt 1 day ago [-]

Awesome thanks for sharing this! I was too lazy to give a Ruby example alongside.

reply

jedisct1 1 day ago [-]

And Ruby regexes are amazing.

reply

omaranto 1 day ago [-]

I thought that they were amazing because they were just like Perl's. Are there any differences?

reply

---

great article about some neat things in bash that we can probably learn from:

https://zwischenzugs.com/2018/01/06/ten-things-i-wish-id-known-about-bash/

---

 jordigh 8 hours ago [-]

Using readline is a great thing to know about too.

My favourite little-known readline command is operate-and-get-next:

https://www.gnu.org/software/bash/manual/html_node/Miscellan...

You can use it to search back in history with C-r and then execute that command with C-o and keep pressing C-o to execute the commands that followed that one in history. Very helpful for executing a whole block of history.

For some reason, this documentation is hard to find! It's not here, for example:

http://readline.kablamo.org/emacs.html

I'm a bit saddened when readline replacements don't implement C-o. For example, the Python REPLs don't have it.

reply
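If some other binding has clobbered C-o, it can be pointed back at the function from ~/.bashrc (a sketch; note operate-and-get-next is bash-specific, so other readline programs won't have it):

    bind '"\C-o": operate-and-get-next'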

lillesvin 6 hours ago [-]

I've overridden ctrl-r in my local Bash to search with fzf[0] and I'm using my history so much more now.

Didn't know about ctrl-o though, it sounds great! I hope that my ctrl-r override doesn't somehow break it.

[0]: https://github.com/junegunn/fzf

E: Fixed link.

reply

CaptSpify 5 hours ago [-]

That link is a 404 for me?

reply

tokenizerrr 5 hours ago [-]

It had a trailing >, https://github.com/junegunn/fzf

reply

---

chriswarbo 54 minutes ago [-]

For me, the biggest gotcha in bash is whether or not a sub-process/shell will be invoked, which can affect things like mutable variables and the number of open file handles. For example:

    COUNT=0
    someCommand | while read -r LINE
                  do
                    COUNT=$(( COUNT + 1 ))
                  done
    echo "$COUNT"

This will always print `0`, since the `COUNT=` line will be run in a sub-process due to the pipe, and hence it can't mutate the outer-process's `COUNT` variable. The following will count as expected, since the `<()` causes `someCommand` to run in a sub-process instead:

    COUNT=0
    while read -r LINE
    do
      COUNT=$(( COUNT + 1 ))
    done < <(someCommand)
    echo "$COUNT"

Another issue I ran into is `$()` exit codes being ignored when spliced into strings. For example, if `someCommand` errors-out then so will this:

    set -e
    FOO=$(someCommand)
    BAR="pre $FOO post"

Yet this will fail silently:

    set -e
    BAR="pre $(someCommand) post"

reply
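A third option for the counting gotcha above is bash's lastpipe (a sketch, assuming bash >= 4.2 with job control off, i.e. a non-interactive script): it runs the final pipeline stage in the current shell, so the mutation survives:

    #!/bin/bash
    shopt -s lastpipe
    COUNT=0
    someCommand | while read -r LINE
    do
      COUNT=$(( COUNT + 1 ))
    done
    echo "$COUNT"   # counts as expected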

---

bash:

  parameter     result
  -----------   ------------------------------
  $name         polish.ostrich.racing.champion
  ${name#*.}           ostrich.racing.champion
  ${name##*.}                         champion
  ${name%%.*}   polish
  ${name%.*}    polish.ostrich.racing
  ${name%.*.*}  polish.ostrich
  ${name#*.*.}                 racing.champion
  [1]
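A quick way to see the lazy/greedy distinction (# and ## strip from the front, % and %% from the back):

    name=polish.ostrich.racing.champion
    echo "${name#*.}"    # ostrich.racing.champion (shortest prefix match removed)
    echo "${name##*.}"   # champion                (longest prefix match removed)
    echo "${name%.*}"    # polish.ostrich.racing   (shortest suffix match removed)
    echo "${name%%.*}"   # polish                  (longest suffix match removed)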

---

lvillani 8 hours ago [-]

shellcheck (https://www.shellcheck.net) is an absolute godsend when writing Bash/POSIX sh scripts.

It catches so many errors that I think it's a must have in every programmer's toolbox. It even catches bash-isms when you are targeting POSIX sh. It saved me many many hours of grief trying to debug shell scripts I wrote and changed the way I write them for the better.

reply

aequitas 6 hours ago [-]

This is also my number one thing I wish I'd known about Bash. It saves on so many trivial bugs.

The documentation is especially great: for every problem it detects you get a unique reference which you can look up on the wiki, e.g. https://github.com/koalaman/shellcheck/wiki/SC2086 It then not only describes the problem but also shows different ways of solving it, with some great examples and reasoning. I think I learned more Bash from Shellcheck than anywhere else.

reply
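A minimal sketch of the class of bug behind SC2086 (hypothetical file name):

    $ cat > fragile.sh <<'EOF'
    #!/bin/sh
    target=$1
    rm $target    # unquoted: subject to word splitting and globbing
    EOF
    $ shellcheck fragile.sh   # reports SC2086: Double quote to prevent globbing and word splitting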

---

tutorial on Bourne shell:

http://www.grymoire.com/Unix/Sh.html

---

thristian 2377 days ago

parent favorite on: Plan9 has been forked: 9front

After having some experience trying to write portable Bourne Shell scripts, I recently came across Tom Duff's paper[1] describing the "rc" shell used by Plan 9. I was quite blown away by the number of common shell-scripting problems it solves, and the elegance of the resulting language. There's no chance of it replacing, say, Python, but I'm sorely tempted to install Plan 9 From User Space and try out rc the next time I need to script something.

[1]: http://doc.cat-v.org/plan_9/4th_edition/papers/rc

p9idf 2377 days ago [-]

The main feature that Rc has which Bash lacks is simplicity. Bash's man page is twice the size of Rc's entire source code.

sigil 2376 days ago [-]

I'm not an rc expert, merely a disgruntled POSIX shell scripter, so here are some attractive rc features [1] missing from sh and bash:

> Rc captures command exit status in the variable $status. For a simple command the value of $status is just as described above. For a pipeline $status is set to the concatenation of the statuses of the pipeline components with | characters for separators.

Both bash and ksh have the pipefail option, which at least makes the pipeline fail if any component fails. (Horrifyingly, POSIX sh lacks even this.) Actually seeing which component failed can be really handy. In other shells, you only get $?.

> Arbitrary file descriptors may be sent through a pipe by typing, for example, "vc junk.c |[2] grep -v '^$'". This deletes blank lines from the C compiler's error output.

POSIX sh, bash, and ksh can only connect stdout to stdin across a pipe. Connecting arbitrary descriptors is nice, particularly in logging applications.

> There is no need for the distinction between $* and $@. There is no need for four types of quotation, nor the extremely complicated rules that govern them.

How many people understand the difference between $* and $@ in POSIX sh or bash, or for instance the difference between "$@" and $@? These are critical things any POSIX shell scripter should know, yet I've met very few that actually do. One approach is to educate people about these things. Another is to avoid the need to educate them in the first place.

I like that rc cleans up quoting, which I've seen single-handedly scare developers off of shell scripting.

[1] http://doc.cat-v.org/plan_9/4th_edition/papers/rc
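For reference, here is the $* / $@ distinction the comment alludes to (a small bash demo; "$@" is almost always the one you want):

    set -- "a b" c           # two arguments, the first containing a space
    printf '[%s]\n' "$@"     # [a b] [c]   -- each argument preserved
    printf '[%s]\n' "$*"     # [a b c]     -- joined into one word with IFS
    printf '[%s]\n' $@       # [a] [b] [c] -- unquoted: re-split on whitespace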

anon_d 2376 days ago [-]

rc solves problems with sh by replacing annoying/problematic features. bash adds the same features, but keeps all the problems with sh. Read the "Design Principles" section for specific examples.

p9idf 2377 days ago [-]

Erik Quanstrom has a version which fixes some of Rc's faults, like having no return statement and the weird ifs behaviour.

uriel 2377 days ago [-]

The lack of return statements is rarely missed; although it could be handy in some cases, there are ways around it.

There is nothing weird about ifs' behavior once you understand it.

Byron's rc tried to 'fix' ifs, and made everything worse.

 uriel 2377 days ago [-]

rc is a truly beautiful thing.

I wrote a pretty complex web framework and CMS with it ( http://werc.cat-v.org ), and the more I used it, the more I loved it.

I found it to be extremely well crafted, with every feature fitting beautifully with the rest. It misses many fancy things people expect, but once you know your way around what it has, you can do almost anything.

Like awk, it is one of those few languages that you can fully keep in your head, and that, plus the power of calling other Unix/Plan 9 commands, makes it very effective.

---

this sounds very interesting:

http://www.oilshell.org/blog/2018/01/28.html

ComputerGuru 1 day ago [-]

Lots of overlap in design goals with fish, except fish also places a premium on users interactively using the shell (which means a friendlier in-repl experience but a balancing act when it comes to features). Fish's autocompletions are incredible, too.

Best of luck to them. Another interesting shell to check out is elvish, lots of new ideas there (even if awkward to use).

(Disclosure: I’m one of the core fish devs/maintainers. Edit: The entire team is awesome and the others deserve virtually all the credit!)

reply

oelmekki 1 day ago [-]

I totally love fish, it's the first shell that made me replace bash as my default shell (zsh was "just not enough" to justify losing compatibility, for me).

But is this project really overlapping it? I see fish as mostly "UX" centered, while oilshell - as far as it's stated here - reminds me more of powershell (I found the idea of having a modern language behind it so cool when it was released; too bad it wasn't my ecosystem): it looks like oilshell is targeting scripting more than UX.

laumars 1 day ago [-]

Shameless self promotion but I'm writing my own shell, murex, as well[1]

The goals of mine are akin to Fish in terms of REPL use but with a greater emphasis on scripting.

Like Fish, murex also does man page parsing (in fact I wrote mine before realising Fish did the same), but unlike Fish, autocompletions can be defined by a flat JSON file (much like Terraform) as well as dynamically with code.

Currently I'm working on murex's event system, so you can have the shell trigger code upon events like file system changes.

My ultimate aim is to make murex a go-to systems administration tool - if just for myself - but the project is still young.

[1] https://github.com/lmorg/murex

reply

djsumdog 1 day ago [-]

I messed around with oil earlier this week when I saw this posted elsewhere and I've been a fish user for about 4 years now.

I'll have to give yours a spin too. I'm glad there's a lot of shell innovation right now. I'm all for breaking posix shell standards and creating things that are way more usable. Fish's prompt customization, functions, highlighting, completion and searching are pretty amazing.

I realize a lot of these little projects will come and go. No matter what, they're great learning tools for the creators/developers, exploring what it takes to make an interactive shell.

Still, I hope we see more stuff like fish come out (and make no mistake, fish took a lot of years and a lot of devs. In the early days my instance would crash every once in a while in ways I couldn't easily reproduce). It's great that we're finally getting away from the traditional bash/zsh/ksh stuff and into newer shells that make coding and navigation easier.

reply

adrianratnapala 19 hours ago [-]

Well I only started reading about Oil just now, but actually it seems to be the exact complement of Fish.

That is, Fish intends to be a useful interactive shell, and if it is also scriptable, that is because you need scripting for it to be useful. Fish doesn't make a serious attempt at being the language in which system scripts are programmed.

Oil on the other hand is a concerted effort to formalise and rigorously implement the language which existing system scripts are written in. The author believes that is a starting point for a good interactive shell -- but programming comes first.

reply

zokier 1 day ago [-]

> Another interesting shell to check out is elvish, lots of new ideas there (even if awkward to use).

Elvish is pretty nifty, but the biggest failing point to me is that the fancy rich pipelines really work just for in-process stuff, which for me kinda loses the point of being a shell. Of course I do realize that rich (polyglot) interprocess pipelines is a difficult problem; some might say a pipedream.

reply

xiaq 19 hours ago [-]

Interprocess pipeline is a difficult problem only because there is no standard encoding. Elvish has pairs of builtin commands like from-json and to-json; so if your command takes JSON and writes JSON, you can use it in a pipeline like:

    ... | to-json | your-command | from-json | ...

It is also trivial to wrap it into a function like:

    fn f { to-json | your-command | from-json }

reply

watersb 13 hours ago [-]

PowerShell on Windows is the only production system that I have used which passes "objects" through pipes.

It's a challenge for me to use well; not sure all that richness is composable. Better programmers than I am would know.

reply

jhillyerd 1 day ago [-]

After using fish for 3 years, I'm finding there is very little reason to have my login shell maintain backwards compatibility with bash.

The only time I run into issues is when a command expects to manipulate environment variables via bash syntax.

I think the fish documentation WRT scripting could be much better, but the language is more elegant than bash or PowerShell, IMHO.

reply

jernfrost 1 day ago [-]

Interesting seeing so many fish fans. I absolutely love fish. It makes my everyday shell usage so much nicer. But it seems like a totally unknown shell to most people. I never see anybody else use it at any job I've had.

I did use fish a bit as a script language, but I decided for anything of any size I much prefer Julia. For typical file system navigation, fish is better, but Julia is actually pretty decent as a shell, despite being a real language. So writing shell scripts in it is pretty nice.

In the beginning I wrote separate programs executed from fish shell. But now I just fire up Julia as a shell and run functions directly there interactively.

reply

dilap 16 hours ago [-]

I remember a few years ago poking at julia and thinking it would make a really good shell language. The thing that killed it for this use at the time was slow startup; is that better now?

reply

Sean1708 5 hours ago [-]

Much better, it's certainly worth giving it another go. It's still much slower than Python, but it's quick enough that I don't notice it at all.

  $ time julia -e 'println("Hi")'
  Hi
  real    0m0.241s
  user    0m0.216s
  sys     0m0.196s
  $ time python3 -c 'print("Hi")'
  Hi
  
  real    0m0.046s
  user    0m0.020s
  sys     0m0.000s

reply

...

Rash also tries to be better than the competition by adding object pipelines (much like Powershell; it makes it much more reasonable to build system administration commands in the host language, and to have rich interaction with and inspection of command results), user-definable pipeline operators, and generally tighter integration with the host language while still having a light syntax for basic commands.

...

chubot 16 hours ago [-]

OK great, glad you have made use of the page. We had an "alternative shells" thread about a year ago between the authors of Elvish, NGS, Oh, and mash. Those were the main active/nascent shells I could find.

It might be time to start that again to see what ideas people have and what has changed. If you're interested e-mail me at andy@oilshell.org.

I'm also interested in the Shill shell, which as I understand was written in Racket, and then somehow they moved away from Racket? I'm not sure. I think it was because of the runtime. I also saw some efforts to move Racket to Chez Scheme.

I very much like Racket as an idea -- a meta language -- but I haven't gotten a chance to play with it too much.

And I did experiment with femtolisp as a basis for Oil -- the lisp used to bootstrap Julia -- but I decided against it.

reply

---

in praise of 'fish' over 'bash':

h1d 14 hours ago

parent flag favorite on: Safe ways to do things in bash

I think it's time people start using something better than bash/zsh that is decades old, like fish or even come up with a more modern shell.

Even by looking at these examples, you see it has less verbosity, like dropping "then" and "do"; you can reference arguments as $argv instead of the cryptic $@, and the exit status code as $status instead of $?, which is easy to confuse with $! and the like.

https://blog.codeship.com/lets-talk-about-shell-scripting/

https://fishshell.com/docs/current/tutorial.html

Xcelerate 21 hours ago

parent flag favorite on: Safe ways to do things in bash

I always think — when a programming/scripting language requires this much bizarre knowledge just to write basic code that performs basic tasks, perhaps it is time for that language to be retired.

I really don't understand why bash still exists. I've switched to fish and am much happier with the change.

and in praise of 'oil':

Sean1708 12 hours ago [-]

You might find the Oil Blog[0] interesting, there are a lot of interesting thoughts there about what a truly modern shell might look like.

[0]: https://www.oilshell.org/blog/

reply

h1d 11 hours ago [-]

Great read, thanks.

reply

---

"4. Bash isn't that bad once you get used to it (seriously. I'll grant you tho that arrays are still nasty ;-) )." [2]

---

nodesocket 1 day ago

parent flag favorite on: Safe ways to do things in bash

Highly recommend shellcheck[1]. There is a SublimeLinter plugin[2] that automatically checks your shell scripts as you code them. It generally makes best practice suggestions, including quoting.

[1] https://github.com/koalaman/shellcheck

[2] https://github.com/SublimeLinter/SublimeLinter-shellcheck

---

[3]

Shell is the right language for tasks that primarily involve running lots of subprocesses, whether they are simple or complex. If a small part of your task needs functionality that can't be done well in shell, fortunately, shell is very good at running subprocesses, and it's a perfectly reasonable approach to do something like this:

    foo () {
        python3 -c 'import sys, foo; print(foo.bar(sys.argv[1:]))' "$@"
    }
    a="$(foo "$baz" "$quux")"

I regularly do this with the requests and json modules in particular, because being an HTTP client or a JSON parser is not a thing shell is good at. (For the specific problem of manipulating JSON, jq is another fine option if you have it installed.)
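The same embedding trick works for jq; a sketch with a hypothetical endpoint and field names:

    version="$(curl -s https://example.com/release.json | jq -r '.latest.version')"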

---

coandco 16 hours ago [-]

I'd recommend checking out Plumbum (https://plumbum.readthedocs.io) -- at the very least, it has a solid base for easily setting up pipelines, input/output redirection, and signal handling.

reply

---

https://github.com/anordal/shellharden/blob/master/how_to_do_things_safely_in_bash.md

---

remember to look at perl, oil shell (for shell ideas), fish shell (for shell ideas), icon (for string processing), Plumbum, pyped.

---

http://www.oilshell.org/blog/2018/01/28.html#limit-to-posix

---

commandline argparse

http://click.pocoo.org/5/

---

[4]

---

emmanueloga_ 11 hours ago [-]

Don't get me wrong! I love command line apps. But I wonder if we all have a bit of Stockholm syndrome... there are several things that suck about them...

While writing this I'm thinking on my experience trying to do anything with ffmpeg or imagemagick... or even find.

Is there a better way? Not sure. I like powershell a bit but some of the things I mention above still apply.

I think we may be able to get a workflow that is a bit closer to the tooling we use for writing programs while not being perceived as verbose and heavy (I'm thinking, the kind of workflow I get with a Clojure repl).

reply

---

lready__Taken 8 hours ago [-]

Powershell made 2 things gospel, and I'm just sad it's going to take 20 years for other operating systems to realize that's the way to do it, and something that replaces linux to see it in practice (i.e. it just won't):

1. Auto-complete is by-design part of the language/shell.

2. The pre-approved verb list that prefixes commands helps discoverability and usability no end. Still learning? Get-<anything> will literally never cause a problem.

Because of the above, get-help <function> can automatically create some fairly useful documentation right away, and whoever wrote the code didn't need to do anything. They can significantly add to the help, though. Because that mechanism is part of the language, it's worth doing: literally every user is going to access it, not some blog you wrote 5 years ago that I hope is still online.

Passing objects rather than strings -- whatever, maybe it's not the best way, but I think it's great. And that's not the only thing to learn from PS.

If anyone uses the cli and doesn't know how the PS help system works, it's certainly a breath of fresh air to learn.

reply

agumonkey 6 hours ago [-]

I never really found the verb list to be entirely satisfactory. It's great that they gave boundaries but I find it too verbose. There are other kinds of syntactic ergonomics with concise vocabulary (lisp has ! xxx p$ for instance, they're a bit harder to swallow but I find the code a bit more poetic and easier to remember as a pattern).

About the objects vs strings, Kalman Reti (of Symbolics IIRC) talked [1] about how an OS passing pointers could enjoy a much easier time instead of serializing everything as strings then deserializing, especially when done ad-hoc through sed/grep/perl/whatever .. It pains me to see this. It pains me to see how linux basic utils are 30% --usage, 30% output formatting (and they all share this).

MS did a great thing with PS.

[1] https://www.youtube.com/watch?v=o4-YnLpLgtk

---

[5]

" CLIs need to start quickly. Use $ time mycli to benchmark your CLI. Here is a rough guide:

    <100ms: very fast (sadly, not feasible for scripting languages)
    100ms–500ms: fast enough, aim here
    500ms-2s: usable, but not going to impress anyone
    2s+: languid, users will prefer to avoid your CLI at this point"

" 12. Follow XDG-spec

XDG-spec is a great standard that should be used to find out where to put files. Unless environment variables like XDG_CONFIG_HOME say otherwise, use ~/.config/myapp for config files, and ~/.local/share/myapp for data files.

For cache files though, use ~/.cache/myapp on Unix, but on macOS it’s better to default to ~/Library/Caches/myapp. On Windows you can use %LOCALAPPDATA%\myapp. "
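In shell, honoring the spec comes down to falling back only when the XDG variables are unset (a sketch for a hypothetical myapp):

    config_dir="${XDG_CONFIG_HOME:-$HOME/.config}/myapp"
    data_dir="${XDG_DATA_HOME:-$HOME/.local/share}/myapp"
    cache_dir="${XDG_CACHE_HOME:-$HOME/.cache}/myapp"
    mkdir -p "$config_dir" "$data_dir" "$cache_dir"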

---

[6]

many of the following are direct quotes, even if unquoted

1. Provide help (mycli, mycli --help, etc.)

2. Prefer flags to args.

3. Provide mycli --version.

4. Use stderr so that the user can pipe output. "If you run a subcommand in your CLI, make sure you pipe the stderr of that subcommand up to the user always. This way any issues are surfaced ultimately to the user’s screen."

5. A great error message should contain: an error code (may be alphabetic, e.g. EPERM), an error title, an error description, how to fix the error, and a URL for more information -- note that this advice is controversial; some would prefer all errors to fit on one line, for easier grepping.

6. Be fancy: "Use colors/dimming to highlight important information. Use spinners and progress bars to show long-running tasks to tell the user you’re still working. Leverage OS notifications when a very long-running task is done." And checkboxes and radio buttons -- but check that the user's stdout is connected to a tty first (otherwise they may be piping to a file), and respect TERM=dumb, NO_COLOR [7], --no-color, etc. (a sketch of this check follows below).

7. "For accepting input, if stdin is not a tty then prompt rather than forcing the user to specify a flag". Also prompt to confirm dangerous commands; maybe even require the user to retype something rather than just y/n. -- note that this advice is controversial; many users would rather that a cli running in a non-tty environment fail with an error if it needs more information, or if it needs a confirmation flag for a dangerous action.

8. Use tables. "Note that cli.table() from cli-ux@5 allows you to easily create tables following these principles:"

they also provide a library, oclif
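A sketch of the tty/color check behind points 4 and 6 (colorize only when stdout is an interactive terminal that wants color; keep diagnostics on stderr):

    if [ -t 1 ] && [ "${TERM:-dumb}" != dumb ] && [ -z "${NO_COLOR:-}" ]; then
      bold=$'\e[1m'; reset=$'\e[0m'
    else
      bold=''; reset=''
    fi
    printf '%s3 files processed%s\n' "$bold" "$reset"   # data/status to stdout
    echo 'warning: 1 file skipped' >&2                  # diagnostics to stderr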

---

davemp 5 hours ago [-]

I find PS tab complete to be sub par compared to fish. It’s too aggressive when there are multiple completions.

reply

felixfbecker 3 hours ago [-]

What do you mean by aggressive?

reply

Klathmon 3 hours ago [-]

I don't know what they mean, but I know that I really dislike how PS's autocomplete works.

If the possible commands are (for the sake of discussion) `Get-AppLockerFileInformation` and `Get-AppLockerPolicy`.

If you type `Get-App` and hit Tab, it will autofill to `Get-AppLockerFileInformation`, which I just really don't like, and I need to keep hitting tab to cycle through all other possible `Get-App*` commands.

What I want is some UI that when I hit tab it autocompletes as much as possible until there's a decision to make, and then shows me the options. So in my example case, it would look something like this:

Type `Get-App` -> hit tab -> shows `Get-AppLocker` -> type "P" -> hit tab -> `Get-AppLockerPolicy` is displayed

bash-ish systems nail this, and they even have the "double tab" to list all options from that point which is normally pretty nice.

The PS idea of "keep hitting tab or shift-tab while you cycle through all possible options" sucks in comparison. Especially with discoverability (There are about 20 commands on my windows system that start with "Get-App", and in order to figure them all out I just need to keep hitting tab until it cycles around)

reply

lzybkr 2 hours ago [-]

With PSReadLine?, you can change that:

  Set-PSReadLineOption -EditMode Emacs

Or

  Set-PSReadLineKeyHandler -Key Tab -Function Complete

reply

Klathmon 1 hour ago [-]

The first one seems to change it how I want, but the second doesn't. And it doesn't stick around after closing the powershell prompt.

---

how to easily make a manpage:

"Just using pandoc to convert your, say, README.md gives good-enough results:

pandoc -s -f markdown_github -t man -o your_cli.1 README.md " [8]

---

" I myself write doco in Docbook XML

generate HTML from that that can be read directly

    xdg-open /usr/local/share/doc/nosh/machineenv.html 

or on the WWW

and generate roff for man pages from it as well

    man machineenv

Whereas --help output is auto-generated from the option definitions given to the command-line parser, much like libpopt does. " [9]

---

 OJFord 10 hours ago [-]

> 12. Follow XDG-spec

I'm so glad to see this included. I don't like $HOME being cluttered with .<app> config directories, but worse than that, far too many apps, when releasing on macOS, say "oh, Library/Application Support/<app>/vom/something is the standard config location on Mac, so I'll respect XDG on Linux, but on Mac it should go there". No! Such an unfriendly location for editable config files.

reply

JdeBP 2 hours ago [-]

Erroneously naming it "XDG-spec" is not very good, though. It is the "XDG Base Directory Specification", just one of several XDG specs.

reply

---

 twic 57 minutes ago [-]

> 8. Use tables

> By keeping each row to a single entry, you can do things like pipe to wc to get the count of lines, or grep to filter each line

> Allow output in csv or json.

Yes please. Default to readable-but-shellable tabular output, and support other formats.

libxo from the BSD world is a really smart idea - it provides an API that programs can use to emit data, with implementations for text, XML, JSON, and HTML:

http://juniper.github.io/libxo/libxo-manual.html

I personally love CSV output. Something like libxo means that CSV output could be added to every program in the system in one fell swoop.

reply

---

dotancohen 3 hours ago [-]

> My own pet peeve is un-disablable stdout logging.

I really think that we need a stdmeta file descriptor. See my comments at the link below, I would appreciate your feedback:

https://unix.stackexchange.com/questions/197809/propose-additional-file-descriptor-stdmeta

reply

TeMPOraL 1 hour ago [-]

I like the idea in Common Lisp - beyond stdin, stdout and stderr, it also specifies bidirectional streams for general interactivity (querying users for data and accepting their input), interactive debugging and an output stream for tracing&timing code. See: http://www.lispworks.com/documentation/lw61/CLHS/Body/v_debug_.htm

Unfortunately, if using CL to deploy a CLI app for modern systems, all of this has to be shoehorned into the usual stdin/stdout/stderr split.

reply

---

more on the stdmeta idea:

" However, I see real use for a "stdmeta" file descriptor that would be supported by common CLI tools. This file descriptor would output lines that are not part of the data yet are not errors. How might one go about proposing such an addition to Unix or Linux?

One example usage would be to output the headers of commands such as ps:

$ ps PID TTY TIME CMD 6394 pts/15 00:00:00 bash 10294 pts/15 00:00:00 ps 10295 pts/15 00:00:00 bash "

" Another example is the output of curl. Curl pipes metadata to stderr so that it can inform humans of the progress without adding complexity to applications to which might be piped its output. Again, this is the use of stderr as an substitute for the missing stdmeta. "

to me this suggests a 'stdmeta' and a 'progress' output
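Until such a descriptor exists, the convention can be approximated with fd 3 (a sketch; mytool is hypothetical -- it writes header/meta lines to fd 3 and data to stdout, and callers decide where the metadata lands):

    mytool() {
      echo "PID CMD" >&3       # meta -> fd 3
      echo "6394 bash"         # data -> stdout
    }
    mytool 3>&2 | wc -l          # human sees the header on stderr; wc counts only data
    mytool 3>/dev/null | wc -l   # scripts can drop the metadata entirely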

---

twic 1 hour ago [-]

I am one of the handful of people to use a white background in my terminals. A significant amount of coloured output is illegible to me.

I would suggest these rules for using default coloured output:

1. Don't.

2. Really, don't. Bold is fine, though!

3. (experts only) Make sure the colour scheme works with white-on-black, black-on-white, white-on-navy (for powershell), and Monokai/Solarized/whatever the flavour of the month for insecure hipsters is.

If you use colours, by default or not, make it really easy to configure the colours, so people can make it work with their terminal's colour scheme.

reply

---

" Creating a phone directory with that affordance is a lot of work today. However user interfaces are different today. 80s DOS text mode only had a keyboard input, a 16-color palette, 80 columns over 25 rows, and just text, text, text. Modern web UI is event-driven, has millions of colors, works across screen sizes, and is networked by default.

Though with this essential complexity, there is so much accidental complexity. It shouldn't be this hard to build software. The essence of a double-entry bookkeeping software 30 years ago is the same as one we build today. A new framework is probably not the answer. New languages (or old giants) and formal systems might be. Ballerina, Unison, Dark, Ur/Web, Elm, Reason and so on. "

---

" gavinpc 1 day ago [-]

You know, I also got my (official) start in an xBASE language (FoxPro), and while I am obsessed with the history of inventors like Engelbart and his successors, those xBASE environments had more going for them than I tend to remember.

"Ordinary" people could use those tools to make useful things. We seem to have gone backwards from there.

What was different about those systems?

They were integrated usage environments. Batteries included. In FoxPro, you got:

This was first and foremost a usable environment immediately. "Development" was an advanced usage.

Emacs is this way. Smalltalk is this way. What they get is that programming languages are useless by themselves.

Contrast with now: the apps we build involve wiring together a database from here, a compiler from there, just figuring out how you're going to make pieces communicate and finally, you know, get something on the screen. Sure, there's an upside to composing systems a la carte. But the learning curve is far more prohibitive.

(Also, those xBASE systems just will not die. Just this morning I got an email question about a system that's been in service for 25 years now. How many of today's apps do you think will still be used (indeed, sold!) in 2043?)

reply

pjmlp 1 day ago [-]

You can still get close to this experience on Android, .NET, Java, Cocoa.

Because as you say, grammar and semantics (aka language) are not enough.

Hence why when I analyse how suitable a programming language is for a given project, I always consider the whole ecosystem, and not how easy it is do FP, immutability or whatever is cool this year.

reply " ---

" In the talk, he gives several demonstrations a key aspect of why unix pipelines are so practically useful: you build them interactively. A complicated 4 line pipeline started as a single command that was gradually refined into something that actually solves a complicated problem. This talk demonstrates the part that isn't included in the the usual tutorials or "cool 1-line command" lists: the cycle of "Try something. Hit up to get the command back. Make one iterative change and try again." " -- pdkl95

---

shell

http://www.jonathanturner.org/2019/08/introducing-nushell.html

---

tlhunter 1 day ago [-]

I've been a die-hard Linux user for about a dozen years. Recently I had to do some development with MS Powershell. I was very reluctant at first, but after getting familiar with the technology, I almost fell in love.

"Cmdlets", basically commands used in Powershell, output "objects" instead of the streams of text used in a more classical shell. Powershell has built-in tools to work with these objects. For example, you can take the output from one Cmdlet, pipe it through `SELECT` with a list of fields specified, and get a stream of objects only containing those fields. Other operations can be performed against those objects as well, such as filtering and whatnot.

Back to normal *nix commands, we're starting to see more and more commands introduce direct JSON support [1]. There are even tools to translate output from common commands into JSON [2]. We'll probably see `jq` shipped directly with modern distros soon. Eventually we'll reach a tipping point where it's expected that a command supports JSON output. Tools like `awk`/`sed` might get updated to have richer support for JSON. Finally, we'll have ubiquitous Powershell-like capabilities on every *nix machine.

Powershell _is_ available on Linux. The model of piping objects instead of JSON is both powerful and more efficient (for example, there are no redundant keys like in a stream of JSON objects, leading to fewer moving bytes, like how CSV headers aren't repeated with every row; plus, binary data is smaller than text). But most developers are hesitant to switch out their shell and existing workflows for a completely new tool, which is why Powershell will likely only be adopted by a small subset of sysadmins.

[1] https://daniel.haxx.se/blog/2020/03/17/curl-write-out-json/

[2] https://github.com/kellyjonbrazil/jc

reply
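The jc pattern from [2] looks like this in practice (a sketch; the jq query is illustrative):

    df | jc --df | jq '.[0]'   # classic text output -> JSON -> structured query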

Freaky 1 day ago [-]

Though it's pretty immature, nushell has a similar idea, with its own internal data model being streams of structured, typed data: https://www.nushell.sh/

And back to *nix commands, libxo is used by a chunk of the FreeBSD base tools to offer output in JSON, amongst other things: https://github.com/Juniper/libxo

    -% ps --libxo=json,pretty
    {
      "process-information": {
        "process": [
          {
            "pid": "52455",
            "terminal-name": "5 ",
            "state": "Is",
            "cpu-time": "0:00.00",
            "command": "-sh (sh)"
          },
    -% uptime --libxo=json,pretty
    {
      "uptime-information": {
        "time-of-day": " 8:34p.m.",
        "uptime": 1730360,
        "days": 20,

Be nice to see more tools converted.

reply

gerdesj 1 day ago [-]

    ip -j a | jq

reply

majkinetor 1 day ago [-]

Not only that, but pwsh support for objects doesn't stop at passing objects around and mapping properties to parameters. There are a number of mechanisms in place. All *nix variants solve just 1 of those mechanisms.

IMO, powershell should be added to ALL mainstream distros as a first order citizen. There is no downside to that, given that MS is now a legit FOSS player and that anybody can fork in case something goes wrong along the way...

reply

---

maybe 16 colors (4-bit color), like TempleOS, is a bit limiting. At the least we want 8-bit color (web-safe colors; see https://en.wikipedia.org/wiki/Color_depth#List_of_common_depths ).

I think we want 16 bits total per pixel: we can do 4 bits for each of RGB plus 4 bits of brightness. I think this means that the colors that we can represent are a subset of the 24-bit colors, and we can reach all of the 24-bit gray colors, and many of the other 24-bit colors.

---

as Terry Davis of TempleOS said about his 640x480 16-color graphics: "Also, that's all I feel comfortable with without GPU acceleration supported. A 1600x1200x24 bit scrn takes 37 times more memory, implying 37 times the CPU power. Also, a fixed size keeps it simple with everybody machine having the same appearance."

i think 640x480x4bit is a little low, what would i feel comfortable with if i was choosing a fixed size?

Maybe 720p, 12bit graphics (so, really, you'd end up rounding up to 16 bits per pixel). This is the lowest somewhat common resolution found on https://en.wikipedia.org/wiki/Display_resolution#Common_display_resolutions , at 4.89% on StatCounter?. So:

1280x720 pixels times 2 bytes per pixel = 1843200 bytes (about 1.8 MiB?)

contrast 640x480x0.5 = 640*480/2. = 153600.0 =~ 153.6 kB

1843200/153600.0 = 12x

if we could accept 8 bit color, that would go down to 6x. if we could accept 8 bit color and SVGA (800 x 600), we'd have 480000,

480000/153600.0 = 3.125

so maybe 720p, 8 bit color is the way to go. The 8bit color could be a straight color cube with the last 2 bits being extra bits for brightness (giving 4x the grayscale levels, for a total of 16 grayscale levels).

Or, if more colors at the expense of less grayscale levels is desired, could use the web-safe colors (a 6x6x6 cube): https://en.wikipedia.org/wiki/Web_colors#Web-safe_colors -- these are only 216 colors, leaving 40 spots for more grays, as noted by https://en.wikipedia.org/wiki/Color_depth

A much more common resolution is 1366x768, according to https://en.wikipedia.org/wiki/Display_resolution#Common_display_resolutions. With 16 bits that would be:

1366*768*2 = 2098176
2098176/153600.0 = 13.66

so not much worse than 1280x720, so probably worth it because it's more common.

I like 16 bit color, direct color, with the last 4 bits being an additional brightness multiplier, because:

otoh 5 bits each of R and B and 6 bits of G is probably much better, since to convert to the more common 24-bit color you wouldn't have to do a multiplication, just bit shifts; the disadvantage is that you only have 32 grays rather than 256.
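A sketch of that shift-only 5,6,5 -> 8,8,8 expansion in bash arithmetic (replicating the top bits into the low bits maps 0x1F to 0xFF and 0 to 0):

    rgb565=0xBDF7                      # example 5,6,5 pixel
    r5=$(( (rgb565 >> 11) & 0x1F ))
    g6=$(( (rgb565 >> 5)  & 0x3F ))
    b5=$((  rgb565        & 0x1F ))
    r8=$(( (r5 << 3) | (r5 >> 2) ))    # 5 -> 8 bits, shifts only
    g8=$(( (g6 << 2) | (g6 >> 4) ))    # 6 -> 8 bits
    b8=$(( (b5 << 3) | (b5 >> 2) ))
    printf '#%02X%02X%02X\n' "$r8" "$g8" "$b8"   # prints #BDBEBD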

Based on the examples on the page https://en.wikipedia.org/wiki/Color_depth (in June 2020), i'd say that 8-bit color is good enough/almost good enough, so adding another 4 bits should be good enough.

https://www.japanistry.com/bit-depth/ shows that a gradient of 16 grayscale levels shows notable jumps, however a street photo image with 16 grayscale levels is good enough. It also shows a full-color image with 8-bit color with a lot of blue sky and heavy posterization in the sky. https://petapixel.com/2018/09/19/8-12-14-vs-16-bit-depth-what-do-you-really-need/ suggests that 8 bits PER CHANNEL is needed (so 24 bits), or 12 bits per channel if you are going to postprocess. https://hamamatsu.magnet.fsu.edu/articles/digitalimagebasics.html figure 6 suggests that 5-bit grayscale is good enough (although not perfect). The grayscale pic at the end of https://lowendmac.com/2018/millions-of-colors-vs-thousands-whats-the-difference/ suggests that 6-bit grayscale is good enough (so i bet 5-bit grayscale is too).

https://lowendmac.com/2018/millions-of-colors-vs-thousands-whats-the-difference/ notes that Apple actually used 15-bit (5,5,5) rather than 16-bit (5,6,5) color to simplify things in the Mac LC. There is an example sunset in 15 bit color and it looks good enough to me.

So overall i think 15 bits (5,5,5 bit RGB) direct color is 'good enough'.

So overall i would choose 1366x768x15 as a compromise between:

640x480x4
1920x1080x24

i don't think SVGA (800x600) is enough of an improvement over VGA to be worth it, but just throwing that in there, some valid choices are:

640x480x4 (palette)
800x600x8 (palette?)
1366x768x15 (direct color)
1920x1080x24 (direct color)

640x480x4/palette is the smallest reasonable choice (TempleOS uses this, with a palette: https://www.reddit.com/r/TempleOS_Official/comments/cgmd91/playing_around_with_the_color_pallet_in_templeos/ ). 1920x1080x24/direct is the 'simplest' in terms of definitely supporting anything common today.

I don't think we can assume that every display that Oot might be run on supports 1920x1080x24/direct, or even 1366x768x15/direct. I think we can assume that every display supports 640x480x4/palette (if it's less than that, it's reasonable to make the programmer not use our TUI stuff and use the platform-dependent primitives instead).

Therefore, either we support only 640x480x4/palette, or we support anything up to 1920x1080x24 (so we have to support multiple resolutions). If we were making hardware or an OS like TempleOS?, i see the benefit of forcing a single choice of 640x480x4/palette, so that people can write programs that look the same on every instance, but in our case i think flexibility is more important, so it seems to me that we may as well support 1920x1080x24. We should probably also support lower color depths and maybe even palettes, so as to allow embedded devices to have some TUI without using 32 bits per pixel.

---

https://github.com/alfiopuglisi/guietta

raghavtoshniwal 1 day ago [–]

This is so timely! I am teaching CS to some high-schoolers and they just got introduced to Python, I want to hook them by getting them to make something that they can play around with. Could use this there :D

Somehow, it feels like CS education is getting both easier, as new low-friction tools are made available each day, and harder, because the sheer complexity and number of things you need to know to really understand anything at all the levels of abstraction keeps growing.

reply

gljiva 1 day ago [–]

As a former high-school student who has done many little projects with tkinter at that time, I would recommend against dumping tkinter altogether for several reasons:

However, the reason I did that many projects is that part of my class wasn't really into programming, so if you think that learning and using this will be significantly easier than tkinter basics, go with it, but I would recommend leaving tkinter at least as an option for "those who want to know more", i.e. students who are considering studying CS.

reply

GeorgeTirebiter 1 day ago [–]

What are the URLs for those two well-written references you mention? Thank you.

reply

smt88 17 hours ago [–]

Honestly I would advise against this for beginning coders. It's very "magical" -- there's no easy way to see how the text translates into a UI. It gives them no additional understanding of GUIs at all, really.

Whenever you encounter a new programming topic for the first time, it helps a lot if you encounter it in a fairly universal format.

I'm not sure about Python, but in most languages, there are simple GUI libraries that accomplish both of your goals: exciting students with interactive programs and helping them more deeply understand how GUIs are usually built and maintained.

reply

atoav 16 hours ago [–]

I would advise to use processing.org in python mode. You can draw shapes like lines and rectangles on your own and use it to visualise the programming flow (e.g. draw many lines at different positions with a for loop, but change the color on every fifth to red).

Stuff like this is IMO the best way to learn programming for average people, because the first two blockers are usually:

Seeing what you do is a good way to deal with these points, because then programming can feel more like drawing a doodle instead of doing your taxes. Once they got the basics you could still dive into the rest.

reply

atoav 16 hours ago [–]

Do you know this: https://py.processing.org/tutorials/ ?

IMO this would be the best way to start learning any programming language, because there is a simple IDE with examples, it is very visual, and it helps to convey basic programming structures just by doodling around.

If I (as a open source python dev) had to teach teenagers I would start with this

reply

mindv0rtex 1 day ago [–]

For not too elaborate GUIs in Python, I think TraitsUI is a really neat library: https://github.com/enthought/traitsui

reply

galacticdessert 1 day ago [–]

Based on the very limited documentation it seems to follow a similar approach to PySimpleGUI: https://github.com/PySimpleGUI/PySimpleGUI

PySimpleGUI is indeed simple and quick, not very elegant but useful for small utilities.

reply

---

Brython/JS canvas graphics API example from view-source:https://brython.info/

<script type="text/python"> """Code for the clock"""

import time import math

from browser import document import browser.timer

sin, cos = math.sin, math.cos width, height = 250, 250 # canvas dimensions ray = 100 # clock ray

background = "SteelBlue?" digits = "#fff" border = "blue"

def needle(angle, r1, r2): """Draw a needle at specified angle in specified color. r1 and r2 are percentages of clock ray. """ x1 = width / 2 - ray * cos(angle) * r1 y1 = height / 2 - ray * sin(angle) * r1 x2 = width / 2 + ray * cos(angle) * r2 y2 = height / 2 + ray * sin(angle) * r2 ctx.beginPath() ctx.strokeStyle = "#fff" ctx.moveTo(x1, y1) ctx.lineTo(x2, y2) ctx.stroke()

def set_clock(): # erase clock ctx.beginPath() ctx.fillStyle = background ctx.arc(width / 2, height / 2, ray * 0.89, 0, 2 * math.pi) ctx.fill()

  1. redraw hours show_hours()
  2. print day now_time = time.time() now = time.localtime(now_time) microsecs = now_time - int(now_time) day = now.tm_mday ctx.font = "bold 14px Arial" ctx.textAlign = "center" ctx.textBaseline = "middle" ctx.fillStyle = "#000" ctx.fillText(day, width * 0.7, height * 0.5)
  3. draw needles for hour, minute, seconds ctx.lineWidth = 2 hour = now.tm_hour % 12 + now.tm_min / 60 angle = hour * 2 * math.pi / 12 - math.pi / 2 needle(angle, 0.05, 0.45) minute = now.tm_min angle = minute * 2 *math.pi / 60 - math.pi / 2 needle(angle, 0.05, 0.7) ctx.lineWidth = 1 second = now.tm_sec + microsecs angle = second * 2 * math.pi / 60 - math.pi / 2 needle(angle, 0.05, 0.8)

def show_hours(): ctx.beginPath() ctx.arc(width / 2, height / 2, ray * 0.05, 0, 2 * math.pi) ctx.fillStyle = digits ctx.fill() for i in range(1, 13): angle = i * math.pi / 6 - math.pi / 2 x3 = width / 2 + ray * cos(angle) * 0.82 y3 = height / 2 + ray * sin(angle) * 0.82 ctx.font = "18px Arial" ctx.textAlign = "center" ctx.textBaseline = "middle" ctx.fillText(i, x3, y3) # cell for day ctx.fillStyle = "#fff" ctx.fillRect(width * 0.65, height * 0.47, width * 0.1, height * 0.06)

canvas = document["clock"]

  1. draw clock border if hasattr(canvas, 'getContext'): ctx = canvas.getContext("2d")
    ctx.beginPath()
    ctx.arc(width / 2, height / 2, ray, 0, 2 * math.pi)
    ctx.fillStyle = background
    ctx.fill()
    browser.timer.set_interval(set_clock, 100)
    show_hours()else: document['navig_zone'].html = "canvas is not supported"

</script>

---

https://github.com/willmcgugan/rich#tables

---

"3) Features desirable for shell include syntax highlighting, autosuggestions,command-not-found helper,tab autocomplete, flag complete. My shell already has it."

---

https://github.com/amoffat/sh

" sh is a full-fledged subprocess replacement for Python 2.6 - 3.8, PyPy? and PyPy?3 that allows you to call any program as if it were a function:

from sh import ifconfig print(ifconfig("eth0")) "

---

i don't agree with all this, but i do with a lot of it:

https://clig.dev/#guidelines

most of my disagreements are covered in the top-voted comments in:

https://news.ycombinator.com/item?id=25304257

---

on shell:

jancsika 6 hours ago [–]

"I can just never get past the arg/flag inconsistency/complexity across commands." ...

combatentropy 5 hours ago [–]

You're right. I often think the man page should begin with a few examples, then launch into the neverending list of options.

tutfbhuf 6 hours ago [–]

I wish that shell would be more sane.

I don't think that a shell should be a complete programming language. If you need a programming language, then better use one. There is xonsh if you are looking for something like this.

I think there should be a better bash with an very clean and consistent interface. Some of the most commonly used utils like awk '{ print $2}', sed, grep, sort, ... should be included out of the box to improve performance and to provide consistency across distributions.

reply

ohthehugemanate 5 hours ago [–]

This is exactly the idea behind Powershell. You don't pipe strings between commands, you pipe typed objects, etc. I'm convinced that if it had better support on Linux it would usher in a new age of shells. As it is though, it would be hard to use it as a daily driver.[1]

[1] https://code.joejag.com/2020/a-month-with-powershell.html

reply

geophile 5 hours ago [–]

Check out my pandemic project, marcel: https://marceltheshell.org. It hits that spot you described: still a shell, but more strongly typed. Instead of bailing out and redoing everything in Python (when you reach the end of your pipeline), you can add a bit of Python code in your command.

For example, find all the .py files under the current directory, find the ones that have changed in the last day, and then print the lines (with their filenames) that define classes:

    ls -fr \
    | select (f: f.suffix == '.py' and now() - f.mtime < days(1)) \
    | read -l \
    | select (path, line: line.startswith('class '))

read -l reads and labels a file, i.e., filename -> stream of (filename, line in file)

(Yes, this is doable with find and awk, but this is just an accessible example that gets across the idea of Python functions on the command line.)

reply

geophile 5 hours ago [–]

Oh, and when you do bail out and go to Python, marcel's operators are available in a module (marcel.api), so that you can take your pipeline and move it into Python easily. E.g. here's the same example in Python:

    from marcel.api import *
    for path, line in (ls(file=True, recursive=True) |
                       select(lambda f: f.suffix == '.py' and
                              now() - f.mtime < days(1)) |
                       read(label=True) |
                       select(lambda path, line: line.startswith('class '))):
        print(f'{path}: {line}')

reply

enriquto 5 hours ago [–]

> Yes, this is doable with find and awk

You don't even need awk for that one:

    grep ^class `find . -mtime -1 | grep py$`

Maybe you could try to use an example where it really makes a difference? Some natural operation that would be very cumbersome with plain shell but is easy and direct with marcel. Otherwise many people may fail to see the point.

reply

---

https://github.com/CalebJohn/shell-coupler

---

https://github.com/dbohdan/structured-text-tools/blob/master/README.md

---

"

mjn on Sept 19, 2013

parent favorite on: Command-line tools for data science

I like Google's crush-tools, which works on delimited data (e.g. tab-delimited), a somewhat simpler and faster format than CSV. Lots of the built-in Unix tools also work on delimited data (cut, join, sort, etc.), but crush-tools fills in some gaps, like being able to do a 'uniq' on specific fields, do outer joins, sum columns, etc.:

https://code.google.com/p/crush-tools/ "

https://news.ycombinator.com/item?id=6413162

---

https://news.ycombinator.com/item?id=26746718 this thread talks a little about TUI stuff with CUA

"There are text mode action-oriented apps: ncurses/turbo vision/etc. based applications like *Commander, or the old Borland IDE, or Tilde, or the old FoxPro?, or plenty others.

And there are graphical command-oriented apps like Jupyter or ReGIS? or Sixel or iTerm or TermKit? or Kui or the command palette in VSCode or Atom or others.

And you can have them mixed like in youtube-dl GUI or ffmpeg GUI where the changes you make in the GUI are reflected in different arguments for the command and you can see and edit the command before executing it. This is similar to getting keyboard shortcut suggestions in menus or tooltips. "

---

BeetleB 1 day ago [–]

As much as I like the command line, I couldn't help noticing from the first 10 or so entries[1] that the "stop" text consists of one of:

1. Drag and Drop

2. Right clicking

3. Ctrl-C and Ctrl-V

So these 3-5 things do everything in the list in a GUI, and instead the author wants us to learn 35 different command/syntax combinations?

As an aside, I'd like to write an article saying "You Don't Need Github To Write an Article". If you insist on using Github for that purpose, at least do it properly with a static site generator.

[1] Did not bother with the rest.

---

" 4. Structured shell for the masses. Powershell isn't the worst, but in my view nushell has a bright future ahead. For the people who don't need portability, it may well entirely replace bash, Python and perl for writing more maintainable and user-friendly shell scripts. https://nushell.sh/ "

jiggawatts 1 day ago [–]

Stop shipping your org chart!

Microsoft has always had this problem, but with PowerShell -- which is supposed to be this unified interface to all things Microsoft -- it is glaringly obvious that teams at Microsoft do not talk to each other.

To this day, the ActiveDirectory commands throw exceptions instead of returning Errors. Are you not allowed to talk to them?

The Exchange "Set" commands, if failing to match the provided user name, helpfully overwrite the first 1,000 users instead because... admins don't need weekends, am I right? Who doesn't enjoy a disaster recovery instead of going to the beach?

I'm what you'd categorise as a power user of PS 5.1, having written many PS1 modules and several C# modules for customers to use at scale. I've barely touched PowerShell Core because support for it within Microsoft is more miss than hit.

For example, .NET Core has caused serious issues. PowerShell needs dynamic DLL loading to work, but .NET Core hasn't prioritised that, because web apps don't need it. The runtime introduced EXE-level flags that should have been DLL-level, making certain categories of PowerShell modules impossible to develop. I gave up. I no longer develop for PowerShell at all. It's just too hard.

It's nice that Out-GridView and Show-Command are back, but they launch under the shell window, which makes them hard to find at the best of times and very irritating when the shell is embedded (e.g. in VS Code).

The Azure commandlets are generally a pain to work with, so I've switched to ARM Templates for most things because PowerShell resource provisioning scripts cannot be re-run, unlike scripts based on the "az" command line or templates. Graph is a monstrosity, and most of my customers are still using MSOnline and are firmly tied to PS 5.1 for the foreseeable future.

Heaven help you if you need to manage a full suite of Hybrid Office 365 backoffice applications. The connection time alone is a solid 2 minutes. Commands fail regularly due to network or throttling reasons, and scripts in general aren't retry-able as mentioned above. This is a usability disaster.

Last, but not least: Who thought it was a good idea to strip the help content out and force users to jump through hoops to install it? There ought to be a guild of programmers so people like him can be summarily ejected from it!

reply

flower-giraffe 1 day ago [–]

> we're always looking to lower the barrier of entry to get into PowerShell

I’ve used powershell regularly since way back when (it was still called Monad when I first tried it).

I’m extremely comfortable in the Windows environment but even yesterday I found it easiest to shell out to cmd.exe to pipe the output of git fast-export to stop powershell from messing with stdout (line feeds)

I really like the idea of a pipeline that can pass more than text streams but it absolutely has to be zero friction to pipe the output of jq, git (and awk, sed etc for oldies like me) without breaking things.

reply

vdqtp3 1 day ago [–]

The inability to handle simple text is my #1 annoyance. For the rest of them, see jiggawatts's comment

reply

BadInformatics 2 days ago [–]

In case you haven't seen it already, I found https://news.ycombinator.com/item?id=26779580 to be a pretty succinct list of the biggest stumbling points (latency, telemetry and documentation).

A couple of more specific points I'd like to add after experience writing non-trivial PS scripts:

It's nice to see the docs have had a major overhaul since I last dug into them though :)

reply

https://news.ycombinator.com/item?id=26779580 :

jen20 2 days ago

parent flag favorite on: What was the point of [ “x$var” = “xval” ]?

First, I like PowerShell - sufficiently much to have used it as a login shell _on macOS_ for a while.

However, you grossly overestimate the “awesome”. To answer each “angry dude” point in turn:

As an interactive shell, nushell [2] is probably the closest thing to the PowerShell experience which is not tied to .NET Core.

[1]: https://docs.microsoft.com/en-us/powershell/module/microsoft... [2]: https://github.com/nushell/nushell

hiq 2 days ago [–]

> we're always looking to lower the barrier of entry to get into PowerShell, as it can be quite jarring to someone who's used Bash their whole life (myself included).

apt search powershell returns no meaningful result on Debian unstable. I think that's a big barrier to entry, at least for me and people who deploy using docker images based on Debian and Ubuntu.

reply

zaat 2 days ago [–]

Don't return everything, return what I specifically returned (yeah, I know about objects, talking about everywhere else). I know it will never happen, but one can dream. Pain points aside, you and your team are doing an excellent job. Thank you

Edit: unless you are also responsible for DSC, in which case I'll take it back. It's terrible.

reply

---

https://github.com/google/zx

JavaScript for Shell Scripting

---

https://github.com/skullchap/chadstr note the 'backquotes'-like functionality eg:

    cmd CAT = (cmd){"cat"};
    chadstr file = str(CAT, "README.md");   /* i.e. runs: cat README.md */

---

https://github.com/chriskiehl/Gooey Turn (almost) any Python 2 or 3 Console Program into a GUI application with one line

https://github.com/google/python-fire Python Fire is a library for automatically generating command line interfaces (CLIs) from absolutely any Python object.
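
a minimal python-fire sketch (a hypothetical greet.py, assuming fire is installed); Fire() turns the function's parameters into CLI arguments and flags automatically:

    # greet.py -- illustrative only
    import fire

    def greet(name, shout=False):
        msg = f'Hello, {name}!'
        return msg.upper() if shout else msg

    if __name__ == '__main__':
        fire.Fire(greet)   # e.g.: python greet.py World --shout

---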

gorgoiler on June 4, 2020 [–]

Pipes are wonderful! In my opinion you can’t extol them by themselves. One has to bask in a fuller set of features that are so much greater than the sum of their parts, to feel the warmth of Unix:

(1) everything is text

(2) everything (ish) is a file

(3) including pipes and fds

(4) every piece of software is accessible as a file, invoked at the command line

(5) ...with local arguments

(6) ...and persistent globals in the environment

A lot of understanding comes once you know what execve does, though such knowledge is of course not necessary. It just helps.
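
a minimal sketch of points 4-6 in terms of the underlying calls (Python stand-ins for fork/execve):

    import os

    pid = os.fork()
    if pid == 0:
        # the program is just a file (4), invoked with local arguments (5)
        # and an environment of persistent globals (6)
        os.execve('/bin/ls', ['ls', '-l'], {'LANG': 'C'})
    os.waitpid(pid, 0)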

Unix is seriously uncool with young people at the moment. I intend to turn that around and articles like this offer good material.

jcranmer on June 4, 2020 [–]

> (1) everything is text

And lists are space-separated. Unless you want them to be newline-separated, or NUL-separated, which is controlled by an option that may or may not be present for the command you're invoking, and is spelled completely differently for each program. Or maybe you just quote spaces somehow, and good luck figuring out who is responsible for inserting quotes and who is responsible for removing them.

---

cowmix on June 4, 2020 [–]

I use pipelines as much as the next guy but every time I see a post praising how awesome they are, I'm reminded of the Unix Hater's Handbook. Their take on pipelines is pretty spot on too.

http://web.mit.edu/~simsong/www/ugh.pdf

cuddlybacon on June 4, 2020 [–]

I mostly like what they wrote about pipes. I think the bloating of ls, discussed at the start of the shell programming section, is a good example: if pipelines are so great, why have so many unix utilities felt the need to bloat?

I think it's a result of there being just a bit too much friction in building a pipeline. A good portion of that friction tends to be massaging text formats. The standard unix commands for doing that tend to have infamously bad readability.

Fish Shell seems to be making this better with its string command, whose syntax makes it clear what it is doing: http://fishshell.com/docs/current/cmds/string.html I use fish shell, and I can usually read and often write text manipulations with the string command without needing to consult the docs.

Nushell seems to take a different approach: add structure to command output. By doing that, it seems that a bunch of stuff that is super finicky in the more traditional shells ends up being simple and easy commands with one clear job in nushell. I have never tried it, but it does seem to be movement in the correct direction.

rnestler on June 5, 2020 [–]

> Nushell seems to take a different approach: add structure to command output. By doing that, it seems that a bunch of stuff that is super finicky in the more traditional shells ends up being simple and easy commands with one clear job in nushell. I have never tried it, but it does seem to be movement in the correct direction.

I tried nushell a few times and the commands really compose better due to the structured approach. How would one sort the output of ls by size in bash without letting ls do the sorting? In nushell it is as simple as "ls | sort-by size".

atombender on June 5, 2020 [–]

Pipes are a great idea, but are severely hampered by the many edge cases around escaping, quoting, and, my pet peeve, error handling. By default, in modern shells, this will actually succeed with no error:

  $ alias fail='exit 1'
  $ find / | fail | wc -l; echo $?
  0
  0

You can turn on the "pipefail" option to remedy this:

  $ set -o pipefail
  $ find / | fail | wc -l; echo $?
  0
  1

Most scripts don't, because the option makes everything much stricter, and requires more error handling.

Of course, a lot of scripts also forget to enable the similarly strict "errexit" (-e) and "nounset" options (-u), which are also important in modern scripting.

There's another error that hardly anyone bothers to handle correctly:

  x=$(find / | fail | wc -l)

This sets x to "" because the command failed. The only way to test if this succeeded is to check $?, or use an if statement around it:

  if ! x=$(find / | fail | wc -l); then
    echo "Fail!" >&2
    exit 1
  fi

I don't think I've seen a script ever bother to do this.

That's all without capturing the error message from the command. If you want that, you have to start using named pipes or temporary files, with the attendant cleanup. Shell scripting is suddenly much more complicated, and the resulting scripts become much less fun to write.

And that's why shell scripts are so brittle.

codemac on June 5, 2020 [–]

Just use a better shell. rc handles this wonderfully: $? is actually called $status, and it's an array, with one element per command in the pipeline.


---

is elixir good for making commandline utilities? Maybe not; posters here complain that (a) it requires the VM to be installed, (b) the VM is too large to want to include in each executable, (c) unlike Perl and Python the VM is not frequently preinstalled on typical linux boxes. Also at least two people complain about startup latency (one person quantifies this as 200ms "boot time"), and about Elixir defaulting to starting one process for each CPU on the system:

https://elixirforum.com/t/is-elixir-adequate-to-build-a-cli-tool/1875/11 https://elixirforum.com/t/how-suitable-is-elixir-for-a-cli-application/36184

regarding startup latency, one person says:

From a pure runtime startup time perspective: perl < python2 < python3 < ruby < java < elixir < nodejs. So elixir is not the worst. Hell, many CLI programs are written in nodejs; I cringe every time I run webpack.
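
a crude way to measure this yourself (assumes the interpreters are installed; numbers vary wildly by machine and version):

    import subprocess, time

    for cmd in (['perl', '-e', '1'],
                ['python3', '-c', 'pass'],
                ['ruby', '-e', '0'],
                ['node', '-e', '0']):
        t0 = time.perf_counter()
        subprocess.run(cmd, check=True)
        print(f'{cmd[0]}: {(time.perf_counter() - t0) * 1000:.0f} ms')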

---

https://invisible-island.net/ncurses/ncurses-intro.html

---

andyc 26 hours ago

Oil has the start of this, called “headless mode”!

http://www.oilshell.org/blog/2021/06/hotos-shell-panel.html#oils-headless-mode-should-be-useful-for-ui-research

It’s a shell divorced from the terminal. A GUI / TUI can communicate with the shell over a Unix domain socket. There is a basic demo that works!

One slogan is that a shell UI should have a terminal (for external commands); it shouldn’t be a terminal.

As mentioned recently I need people to bang on the other side of this to make it happen, since I am more focused on the Oil language, etc.

singpolyma 17 hours ago

I’m curious how this is different from just running the shell and talking to its stdout/stdin?

andyc (Oil Shell Developer) 16 hours ago

A few different issues, not exhaustive:

    If stdout to the shell is a pipe, then invoking say ls --color will inherit the pipe as stdout. This means isatty(stdout) will return false, which means you won’t get color.
        with the headless shell, the GUI can create a TTY and send the FD over the Unix domain socket, and ls will have the TTY as its stdout! This works!
    You don’t know when the output of ls ends and the output of the next command begins
        with the headless shell you can pass a different TTY every single time. Or you can pass a pipe
    You don’t know where the prompt begins and ends, and where the output of ls begins, etc.
        with the headless shell, you can send commands that render the prompt, return it, and display it yourself in a different area of the GUI

Also, with the headless shell, you can make a GUI completion and history interface. In other words, do what GNU readline does, but do it in a GUI, etc. This makes a lot of sense since say Jupyter notebook and the web browser already have GUIs for history and completion.

(Note there is a bash Jupyter kernel, but it’s limited and doesn’t appear to do any of these things. It appears to scrape stdin/stdout. If anyone has experience I’d be interested in feedback)
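
a tiny demo of the isatty point above (GNU ls; whether color codes actually appear also depends on LS_COLORS, and the pty read assumes a non-empty directory):

    import os, pty, subprocess

    # stdout is a pipe -> isatty() is false inside ls -> no color
    piped = subprocess.run(['ls', '--color=auto'], capture_output=True)
    print(b'\x1b[' in piped.stdout)            # False: no escape codes

    # stdout is a pty -> isatty() is true inside ls -> color codes
    master, slave = pty.openpty()
    subprocess.run(['ls', '--color=auto'], stdout=slave)
    os.close(slave)
    print(b'\x1b[' in os.read(master, 4096))   # True (non-empty dir)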

intercal 16 hours ago

Terminals offer “capabilities”, stuff like querying the width and height, or writing those weird escapes that change the color. I would guess there would either be no capabilities available to a headless shell, or maybe their own limited set of capabilities emulated or ignored in the UI. I haven’t looked at the source so this is merely speculation.

andyc 11 hours ago

Well a typical usage would be to have a GUI process and a shell process, with the division of labor like this:

    GUI process starts the “headless shell” process (osh --headless), which involves setting up a Unix domain socket to communicate over.
    GUI process allows the user to enter a shell command. This is just a text box or whatever.
    GUI process creates a TTY for the output of this command.
    GUI process sends the command, along with the file descriptor for the TTY over the Unix Domain socket to the headless shell, which sets the file descriptor state, parses, and executes the command
    GUI process reads from the other end of the TTY and renders terminal escape codes

So the point here is that the shell knows nothing about terminals or escape codes. This is all handled in the GUI process.

You could have a shell multiplexer without a terminal multiplexer, etc.

If none of the commands needed a terminal, then the GUI doesn’t even need a terminal. It could just do everything over pipes.

So there is a lot of flexibility in the kinds of UIs you can make – it’s not hard-coded into the shell. The headless shell doesn’t print the prompt, and it doesn’t handle completion or history, etc. Those are all UI issues.
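
a sketch of that division of labor, squeezed into one process, using Python's fd-passing over a Unix domain socket (Python 3.9+ for send_fds/recv_fds; this is not Oil's actual protocol, it assumes the command produces a small non-empty output, and a real GUI would read the pty concurrently rather than after the command exits):

    import os, pty, socket, subprocess

    gui_sock, shell_sock = socket.socketpair(socket.AF_UNIX, socket.SOCK_STREAM)

    # "GUI" side: create a pty and send the command plus the slave fd
    master_fd, slave_fd = pty.openpty()
    socket.send_fds(gui_sock, [b'ls --color=auto'], [slave_fd])

    # "shell" side: receive the fd and run the command with it as stdout,
    # so isatty() is true inside the child
    msg, fds, _, _ = socket.recv_fds(shell_sock, 1024, 1)
    subprocess.run(msg.decode().split(), stdout=fds[0])
    os.close(fds[0])

    # "GUI" side: read the (possibly colorized) output from the master end
    os.close(slave_fd)
    print(os.read(master_fd, 4096).decode())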

---

from elixir: "Command line can work, there's even escript for shorter tasks: "escript provides support for running short Erlang programs without having to compile them first, and an easy way to retrieve the command-line arguments."

---

in support of lots of cli options (as an unavoidable consequence of having a lot of different cli tools written by a lot of different people, e.g. no opportunity to have conventions and refactor across tools; combined with shells passing text streams rather than structured data): https://danluu.com/cli-complexity/

---