Warning: Work in progress! Leave feedback on Zulip or Github if you'd like this doc to be updated.
This chapter in the Oils Reference describes builtin commands for OSH and YSH.
Append word arguments to a list:
var mylist = :| hello |
append *.py (mylist) # append all Python files
var myflags = []
append -- -c 'echo hi' (myflags) # -- to avoid ambiguity
It's a shortcut for:
call myflags->append('-c')
call myflags->append('echo hi')
Pretty prints interpreter state. Some of these are implementation details, subject to change.
Examples:
pp proc # print all procs and their doc comments
var x = :| one two |
pp cell x # dump the "guts" of a cell, which is a location for a value
pp asdl (x) # dump the ASDL "guts"
pp line (x) # single-line stable format, for spec tests
Run a block of code, stopping at the first error. In other words, the errexit option is enabled.
Set the _status variable to the exit status of the block, and return 0.
try {
ls /nonexistent
}
if (_status !== 0) {
echo 'ls failed'
}
Handle expression errors:
try {
var x = 42 / 0
}
And errors from compound commands:
try {
ls | wc -l
diff <(sort left.txt) <(sort right.txt)
}
The case statement can be useful:
try {
grep PATTERN FILE.txt
}
case (_status) {
(0) { echo 'found' }
(1) { echo 'not found' }
(else) { echo "grep returned status $_status" }
}
The try builtin may also set the _error register.
Runs a command and requires the exit code to be 0 or 1.
if boolstatus egrep '[0-9]+' myfile { # may abort
echo 'found' # status 0 means found
} else {
echo 'not found' # status 1 means not found
}
The error builtin interrupts the shell program.
error 'Missing /tmp' # program fails with status 10
Override the default status of 10
with a named argument:
error 'Missing /tmp' (status=99)
In YSH, it's customary to use error instead of return 1, since it provides more information:
proc p {
if ! test -d /tmp {
error 'Missing /tmp' # more descriptive than return
}
echo hi
}
Handle the error with the try builtin:
try {
p
}
if (_status !== 0) {
echo $[_error.message] # => Missing /tmp
}
The integer _status is always set, and the Dict _error is set for all "structured" errors, which includes errors raised by the error builtin.
Special properties of _error:
- _error.message - the positional string arg
- _error.status - the named status arg, or the default 10
You can attach other, arbitrary properties to the error:
error 'Oops' (path='foo.json')
They are attached to _error
:
try {
error 'Oops' (path='foo.json')
}
if (_status !== 0) {
echo $[_error.path] # => foo.json
}
The cd builtin takes a block:
cd / {
echo $PWD
}
The shopt builtin also takes a block:
shopt --unset errexit {
false
echo 'ok'
}
Execute a block with a global variable set.
shvar IFS=/ {
echo "ifs is $IFS"
}
echo "ifs restored to $IFS"
Execute a block with a shared "context" that can be updated using the ctx builtin.
var mydict = {}
ctx push (mydict) {
# = mydict => {}
ctx set (mykey='myval')
}
# = mydict => { mykey: 'myval' }
The context can be modified with ctx set (key=val), which updates or inserts the value at the given key.
The context can also be updated with ctx emit field (value).
ctx push (mydict) {
# = mydict => {}
ctx emit mylist (0)
# = mydict => { mylist: [0] }
ctx emit mylist (1)
}
# = mydict => { mylist: [0, 1] }
Contexts can be nested, resulting in a stack of contexts.
ctx push (mydict1) {
ctx set (dict=1)
ctx push (mydict2) {
ctx set (dict=2)
}
}
# = mydict1 => { dict: 1 }
# = mydict2 => { dict: 2 }
ctx is useful for creating DSLs, such as a mini-parseArgs.
proc parser (; place ; ; block_def) {
var p = {}
ctx push (p, block_def)
call place->setValue(p)
}
proc flag (short_name, long_name; type; help) {
ctx emit flag ({short_name, long_name, type, help})
}
proc arg (name) {
ctx emit arg ({name})
}
parser (&spec) {
flag -t --tsv (Bool, help='Output as TSV')
flag -r --recursive (Bool, help='Recurse into the given directory')
flag -N --count (Int, help='Process no more than N files')
arg path
}
Save global registers like $? on a stack. It's useful for preventing plugins from interfering with user code. Example:
status_42 # returns 42 and sets $?
push-registers { # push a new frame
status_43 # top of stack changed here
echo done
} # stack popped
echo $? # 42, read from new top-of-stack
Current list of registers:
Regex data underlying BASH_REMATCH, _group(), _start(), _end()
$?
_status # set by the try builtin
PIPESTATUS # aka _pipeline_status
_process_sub_status
Runs a named proc with the given arguments. It's often useful as the only top level statement in a "task file":
proc p {
echo hi
}
runproc @ARGV
Like 'builtin' and 'command', it affects the lookup of the first word.
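For example, if the file above were saved as mytasks.ysh (a hypothetical name), a task can be invoked by name from the command line:
ysh mytasks.ysh p    # runs proc p, which prints 'hi'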
Registers a name in the global module dict. Returns 0 if it doesn't exist, or 1 if it does.
Use it like this in executable files:
module main || return 0
And like this in libraries:
module myfile.ysh || return 0
The is-main builtin returns 1 (false) if the current file was executed with the source builtin.
In the "main" file, including -c or stdin input, it returns 0 (true).
Use it like this:
if is-main {
runproc @ARGV
}
TODO
Reuse code from other files, respecting namespaces.
use lib/foo.ysh # relative import, i.e. implicit $_this_dir?
# makes name 'foo' available
Bind a specific name:
use lib/foo.ysh (&myvar) # makes 'myvar' available
Bind multiple names:
use lib/foo.ysh (&myvar) {
var log, die
}
Maybe:
use lib/foo.ysh (&myvar) {
var mylog = myvar.log
}
Also a declaration:
use --extern grep sed
YSH adds long flags to shell's read:
read --all # whole file including newline, in $_reply
read --all (&x) # fills $x
And a convenience:
read -0 # read until NUL, synonym for read -r -d ''
TODO: We used to have read --line, but buffered I/O doesn't mix with shell I/O, which reads directly from file descriptors.
write fixes problems with shell's echo builtin.
The default separator is a newline, and the default terminator is a newline.
Examples:
write -- ale bean # write two lines
write -n -- ale bean # synonym for --end '', like echo -n
write --sep '' --end '' -- a b # write 2 bytes
write --sep $'\t' --end $'\n' -- a b # TSV line
You may want to use toJson8() or toJson() before writing:
write -- $[toJson8(mystr)]
write -- $[toJson(mystr)]
The preferred alternative to shell's &.
fork { sleep 1 }
wait -n
The preferred alternative to shell's (). Prefer cd with a block if possible.
forkwait {
not_mutated=zzz
}
echo $not_mutated
Write JSON:
var d = {name: 'bob', age: 42}
json write (d)
Read JSON:
echo hi | json read # fills $_reply by default
Or use an explicit place:
var x = ''
json read (&x) < myfile.txt
Related: json-encode-err and json-decode-error
Like json, but on the encoding side:
- b'\yff' instead of a lossy Unicode replacement char
On the decoding side:
- b'' and u'' strings are accepted
Related: json8-encode-err and json8-decode-error
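A minimal sketch of the usage, assuming the same read/write subcommands as json:
var s = b'\yff'            # bytes that aren't valid UTF-8
json8 write (s)            # encoded as b'\yff', not a replacement char
echo '"hi"' | json8 read   # fills _reply, like json read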
TODO: describe
TODO: when
These builtins take input and output. They're often used with redirects.
read FLAG* VAR*
Read a line from stdin, split it into tokens with the $IFS algorithm, and assign the tokens to the given variables. When no VARs are given, assign to $REPLY.
Note: When writing YSH, prefer the extensions documented in ysh-read. The read builtin is confusing because -r needs to be explicitly enabled.
Flags:
-a ARRAY assign the tokens to elements of this array
-d CHAR use CHAR as the delimiter, instead of newline
-n NUM read up to NUM characters, respecting delimiters
-p STR print the string STR before reading input
-r raw mode: don't let backslashes escape characters
-s silent: do not echo input coming from a terminal
-t NUM time out and fail after NUM seconds
-t 0 returns whether any input is available
-u FD read from file descriptor FD instead of 0 (stdin)
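For example, a sketch using the flags above with a bash-style here string:
read -r first rest <<< 'alpha beta gamma'
echo $first   # => alpha
echo $rest    # => beta gamma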
echo FLAG* ARG*
Prints ARGs to stdout, separated by a space, and terminated by a newline.
Flags:
-e enable interpretation of backslash escapes
-n omit the trailing newline
See char-escapes.
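For example:
echo -n 'no trailing newline'
echo -e 'tab\there'    # -e makes \t a real tab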
printf FLAG* FMT ARG*
Formats values and prints them. The FMT string contains three types of objects:
- Literal characters
- Character escapes like \t. See char-escapes.
- Percent codes like %s that specify how to format each ARG.
If not enough ARGs are passed, the empty string is used. If too many are passed, the FMT string will be "recycled".
Flags:
-v VAR Write output in variable VAR instead of standard output.
Format specifiers:
%% Prints a single "%".
%b Interprets backslash escapes while printing.
%q Prints the argument escaping the characters needed to make it reusable
as shell input.
%d Print as signed decimal number.
%i Same as %d.
%o Print as unsigned octal number.
%u Print as unsigned decimal number.
%x Print as unsigned hexadecimal number with lower-case hex-digits (a-f).
%X Same as %x, but with upper-case hex-digits (A-F).
%f Print as floating point number.
%e Print as a double number, in exponential ("e") notation, with a lower-case e.
%E Same as %e, but with an upper-case E.
%g Interprets the argument as double, but prints it like %f or %e.
%G Same as %g, but print it like %E.
%c Print as a single char, only the first character is printed.
%s Print as string
%n The number of characters printed so far is stored in the variable named
in the argument.
%a Interprets the argument as double, and prints it like a C99 hexadecimal
floating-point literal.
%A Same as %a, but print it like %E.
%(FORMAT)T Prints date and time, according to FORMAT as a format string
for strftime(3). The argument is the number of seconds since
epoch. It can also be -1 (current time, also the default value
if there is no argument) or -2 (shell startup time).
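For example, a small sketch using the flags and format specifiers above:
printf '%s is %d years old\n' bob 42    # => bob is 42 years old
printf -v today '%(%Y-%m-%d)T' -1       # store the current date in $today
echo $today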
Alias for mapfile.
mapfile FLAG* ARRAY?
Reads lines from stdin into the variable named ARRAY (default ${MAPFILE[@]}).
Flags:
-t Remove the trailing newline from every line
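A minimal sketch, assuming a file input.txt exists:
mapfile -t lines < input.txt   # -t strips the trailing newlines
echo ${#lines[@]}              # number of lines read
echo ${lines[0]}               # first line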
These builtins accept shell code and run it.
source SCRIPT ARG*
Executes SCRIPT with given ARGs in the context of the current shell. It will modify existing variables.
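For example, assuming a file lib.sh that defines a function my_func:
source ./lib.sh --verbose   # the ARGs are visible as "$@" inside lib.sh
my_func                     # now defined in the current shell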
eval ARG+
Creates a string by joining ARGs with a space, then runs it as a shell command.
Example:
# Create the string echo "hello $name" and run it.
a='echo'
b='"hello $name"'
eval $a $b
Tips:
- eval can confuse code and user-supplied data, leading to security issues.
- Prefer passing a single string ARG to eval.
YSH eval:
var myblock = ^(echo hi)
eval (myblock) # => hi
trap FLAG* CMD SIGNAL*
Registers the shell string CMD to be run after the SIGNALs are received. If the CMD is empty, then the signal is ignored.
Flags:
-l Lists all signals and their signal number
-p Prints a list of the installed signal handlers
Tip: Prefer passing the name of a shell function to trap.
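For example, following the tip and passing a function name (the temp file path is hypothetical):
cleanup() {
  rm -f /tmp/scratch.$$
}
trap cleanup EXIT    # run the function when the shell exits
trap '' INT          # empty CMD: ignore SIGINT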
The set and shopt builtins set global shell options. YSH code should use the more natural shopt.
set FLAG* ARG*
Sets global shell options. Short style:
set -e
Long style:
set -o errexit
Set the arguments array:
set -- 1 2 3
shopt FLAG* OPTION* BLOCK?
Sets global shell options.
Flags:
-s --set Turn the named options on
-u --unset Turn the named options off
-p Print option values
-q Return 0 if the option is true, else 1
Examples:
shopt --set errexit
You can set or unset multiple options with the groups strict:all, ysh:upgrade, and ysh:all.
If a block is passed, then the mutated options are pushed onto a stack, the block is executed, and then options are restored to their original state.
These 5 builtins deal with the working directory of the shell.
cd FLAG* DIR
Changes the working directory of the current shell process to DIR.
If DIR isn't specified, change to $HOME. If DIR is -, change to $OLDPWD (a variable that the shell sets to the previous working directory).
Flags:
-L Follow symbolic links, i.e. change to the TARGET of the symlink.
(default).
-P Don't follow symbolic links.
pwd FLAG*
Prints the current working directory.
Flags:
-L Follow symbolic links if present (default)
-P Don't follow symbolic links. Print the link instead of the target.
pushd DIR
Add DIR to the directory stack, then change the working directory to DIR.
Typically used with popd and dirs.
popd
Removes a directory from the directory stack, and changes the working directory to it. Typically used with pushd and dirs.
dirs FLAG*
Shows the contents of the directory stack. Typically used with pushd and popd.
Flags:
-c Clear the dir stack.
-l Show the dir stack, but with the real path instead of ~.
-p Show the dir stack, but formatted as one line per entry.
-v Like -p, but numbering each line.
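For example:
pushd /tmp    # stack is now: /tmp <old dir>
pushd /etc    # stack is now: /etc /tmp <old dir>
dirs -v       # show the stack, numbered
popd          # back to /tmp
popd          # back to the original directory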
These builtins implement our bash-compatible autocompletion system.
Registers completion policies for different commands.
Generates completion candidates inside a user-defined completion function.
It can also be used in scripts, i.e. outside a completion function.
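A minimal sketch in the bash-compatible style, where the command name deploy and its word list are hypothetical:
_comp_deploy() {
  local cur=${COMP_WORDS[COMP_CWORD]}
  COMPREPLY=( $(compgen -W 'staging production --dry-run' -- "$cur") )
}
complete -F _comp_deploy deploy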
Changes completion options inside a user-defined completion function.
Adjusts COMP_ARGV according to specified delimiters, and optionally sets the variables cur, prev, words (an array), and cword. May also set 'split'.
This is an OSH extension that makes it easier to run the bash-completion project.
Complete an entire shell command string. For example,
compexport -c 'echo $H'
will complete variables like $HOME. And
compexport -c 'ha'
will complete builtins like hay, as well as external commands.
These builtins mutate the state of the shell process.
exec BIN_PATH ARG*
Replaces the running shell with the binary specified, which is passed ARGs. BIN_PATH must exist on the file system; i.e. it can't be a shell builtin or function.
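For example:
exec /bin/sleep 5    # the shell process is replaced by sleep; nothing after this line runs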
umask MODE?
Sets the bit mask that determines the permissions for new files and directories. The mask is subtracted from 666 for files and 777 for directories.
Oils currently supports writing masks in octal.
If no MODE, show the current mask.
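For example:
umask        # show the current mask, e.g. 0022
umask 022    # new files get mode 644 (666 - 022), new directories 755 (777 - 022)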
times
Shows the user and system time used by the shell and its child processes.
jobs
Shows all jobs running in the shell and their status.
wait FLAG* ARG
Wait for processes to exit.
If the ARG is a PID, wait only for that job, and return its status.
If there's no ARG, wait for all child processes.
Flags:
-n Wait for the next process to exit, rather than a specific process.
Wait can be interrupted by a signal, in which case the exit code indicates the signal number.
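For example:
sleep 2 &
sleep 1 &
wait -n    # returns when the first child exits
wait       # then wait for the remaining one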
fg JOB?
Returns a job running in the background to the foreground. If no JOB is specified, use the latest job.
test OP ARG
test ARG OP ARG
[ OP ARG ] # [ is an alias for test that requires closing ]
[ ARG OP ARG ]
Evaluates a conditional expression and returns 0 (true) or 1 (false).
Note that [ is the name of a builtin, not an operator in the language. Use 'test' to avoid this confusion.
String expressions:
-n STR True if STR is not empty.
'test STR' is usually equivalent, but discouraged.
-z STR True if STR is empty.
STR1 = STR2 True if the strings are equal.
STR1 != STR2 True if the strings are not equal.
STR1 < STR2 True if STR1 sorts before STR2 lexicographically.
STR1 > STR2 True if STR1 sorts after STR2 lexicographically.
Note: < and > should be quoted like \< and \>
File expressions:
-a FILE Synonym for -e.
-b FILE True if FILE is a block special file.
-c FILE True if FILE is a character special file.
-d FILE True if FILE is a directory.
-e FILE True if FILE exists.
-f FILE True if FILE is a regular file.
-g FILE True if FILE has the sgid bit set.
-G FILE True if current user's group is also FILE's group.
-h FILE True if FILE is a symbolic link.
-L FILE True if FILE is a symbolic link.
-k FILE True if FILE has the sticky bit set.
-O FILE True if current user is the file owner.
-p FILE True if FILE is a named pipe (FIFO).
-r FILE True if FILE is readable.
-s FILE True if FILE has size bigger than 0.
-S FILE True if FILE is a socket file.
-t FD True if file descriptor FD is open and refers to a terminal.
-u FILE True if FILE has suid bit set.
-w FILE True if FILE is writable.
-x FILE True if FILE is executable.
FILE1 -nt FILE2 True if FILE1 is newer than FILE2 (mtime).
FILE1 -ot FILE2 True if FILE1 is older than FILE2 (mtime).
FILE1 -ef FILE2 True if FILE1 is a hard link to FILE2.
Arithmetic expressions coerce arguments to integers, then compare:
INT1 -eq INT2 True if they're equal.
INT1 -ne INT2 True if they're not equal.
INT1 -lt INT2 True if INT1 is less than INT2.
INT1 -le INT2 True if INT1 is less or equal than INT2.
INT1 -gt INT2 True if INT1 is greater than INT2.
INT1 -ge INT2 True if INT1 is greater or equal than INT2.
Other expressions:
-o OPTION True if the shell option OPTION is set.
-v VAR True if the variable VAR is set.
The test builtin also supports POSIX conditionals like -a, -o, !, and ( ), but these are discouraged.
Oils supports these long flags:
--dir same as -d
--exists same as -e
--file same as -f
--symlink same as -L
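For example:
test --dir /tmp && echo 'directory'     # long flag, same as -d
test -z "$name" && echo 'name is empty'
[ "$count" -gt 0 ] && echo 'positive'   # note the closing ]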
getopts SPEC VAR ARG*
A single iteration of flag parsing. The SPEC is a sequence of flag characters,
with a trailing :
to indicate that the flag takes an argument:
ab # accept -a and -b
xy:z # accept -x, -y arg, and -z
The input is "$@" by default, unless ARGs are passed.
On each iteration, the flag character is stored in VAR. If the flag has an argument, it's stored in $OPTARG. When an error occurs, VAR is set to ? and $OPTARG is unset.
Returns 0 if a flag is parsed, or 1 on end of input or another error.
Example:
while getopts "ab:" flag; do
case $flag in
a) flag_a=1 ;;
b) flag_b=$OPTARG ;;
'?') echo 'Invalid Syntax'; break ;;
esac
done
Notes:
- $OPTIND is initialized to 1 every time a shell starts, and is used to maintain state between invocations of getopts.
- : and ? can't be flags.
Unimplemented.
Usage: help TOPIC?
Examples:
help # this help
help echo # help on the 'echo' builtin
help com-sub # help on command sub $(date)
help oils-usage # identical to oils-for-unix --help
help osh-usage # osh --help
help ysh-usage # ysh --help
hash
Display information about remembered commands.
hash FLAG* CMD+
Determine the locations of commands using $PATH
, and remember them.
Flag:
-r Discard all remembered locations.
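For example:
hash grep sed    # look them up in $PATH and remember the locations
hash             # show remembered commands
hash -r          # forget them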
type FLAG* NAME+
Print the type of each NAME, if it were the first word of a command. Is it a shell keyword, builtin command, shell function, alias, or executable file on $PATH?
Flags:
-a Show all possible candidates, not just the first one
-f Don't search for shell functions
-P Only search for executable files
-t Print a single word: alias, builtin, file, function, or keyword
Modeled after the bash type builtin.
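For example:
type cd ls      # describe a builtin and an executable file
type -t cd      # => builtin
type -a echo    # show every candidate: the builtin and any file on $PATH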
command FLAG* CMD ARG*
Look up CMD as a shell builtin or executable file, and execute it with the given ARGs. That is, the lookup ignores shell functions named CMD.
Flags:
-v Instead of executing CMD, print a description of it.
Similar to the 'type' builtin.
builtin CMD ARG*
Look up CMD as a shell builtin, and execute it with the given ARGs. That is, the lookup ignores shell functions and executables named CMD.
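For example:
command -v ls     # print what 'ls' would run, instead of running it
command ls /tmp   # run the real ls, even if a function 'ls' is defined
builtin cd /tmp   # run the cd builtin, even if a function 'cd' is defined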
alias NAME=CODE
Make NAME a shortcut for executing CODE, e.g. alias hi='echo hello'.
alias NAME
Show the value of this alias.
alias
Show a list of all aliases.
Tips:
Prefer shell functions like:
ls() {
command ls --color "$@"
}
to aliases like:
alias ls='ls --color'
Functions are less likely to cause parsing problems.
\ls or 'ls' disables alias expansion.
unalias NAME
Remove the alias NAME.
history FLAG*
Display and manipulate the shell's history entries.
history NUM
Show the last NUM history entries.
Flags:
-c Clears the history.
-d POS Deletes the history entry at position POS.
Bash has this, but OSH won't implement it.
YSH includes a command-line argument parsing utility called parseArgs. This is intended to be used for command-line interfaces to YSH programs.
To use it, first import args.ysh:
source --builtin args.ysh
Then, create an argument parser specification:
parser (&spec) {
flag -v --verbose (help="Verbosely") # default is Bool, false
flag -P --max-procs ('int', default=-1, help='''
Run at most P processes at a time
''')
flag -i --invert ('bool', default=true, help='''
Long multiline
Description
''')
arg src (help='Source')
arg dest (help='Dest')
rest files
}
Finally, parse ARGV (or any other array of strings) with:
var args = parseArgs(spec, ARGV)
The returned args is a Dict containing key-value pairs with the parsed values (or defaults) for each flag and argument. For example, given ARGV = :| mysrc -P 12 mydest a b c |, args would be:
{
"verbose": false,
"max-procs": 12,
"invert": true,
"src": "mysrc",
"dest": "mydest",
"files": ["a", "b", "c"]
}
parseArgs() requires a parser specification to indicate how to parse the ARGV array. This specification should be constructed using the parser proc.
parser (&spec) {
flag -f --my-flag
arg myarg
rest otherArgs
}
In the above example, parser takes in a place &spec, which will store the resulting specification, and a block which is evaluated to build that specification.
Inside of a parser block, you should call the following procs:
- flag to add --flag options
- arg to add positional arguments
- rest to capture remaining positional arguments into a list
parser will validate the parser specification for errors such as duplicate flag or argument names.
parser (&spec) {
flag -n --name
flag -n --name # Duplicate!
}
# => raises "Duplicate flag/arg name 'name' in spec" (status = 3)
flag should be called within a parser block.
parser (&spec) {
flag -v --verbose
}
The above example declares a flag "--verbose" and a short alias "-v".
parseArgs() will then store a boolean value under args.verbose:
- true if the flag was passed at least once
- false otherwise
Flags can also accept values. For example, if you wanted to accept an integer count:
parser (&spec) {
flag -N --count ('int')
}
Calling parseArgs with ARGV = :| -N 5 | or ARGV = :| --count 5 | will store the integer 5 under args.count. If the user passes in a non-integer value like ARGV = :| --count abc |, parseArgs will raise an error.
Default values for an argument can be set with the default named argument.
parser (&spec) {
flag -N --count ('int', default=2)
# Boolean flags can be given default values too
flag -O --optimize ('bool', default=true)
}
var args = parseArgs(spec, :| -N 3 |)
# => args.count = 3
# => args.optimize = true
Each name passed to flag must be unique to that specific parser. Calling flag with the same name twice will raise an error inside of parser.
arg should be called within a parser block.
parser (&spec) {
arg query
arg path
}
The above example declares two positional arguments called "query" and "path".
parseArgs() will then store strings under args.query and args.path. Order matters, so the first positional argument will be stored to query and the second to path. If not enough positional arguments are passed, then parseArgs will raise an error.
Similar to flag, each arg name must be unique. Calling arg with the same name twice will cause parser to raise an error.
rest should be called within a parser block.
parser (&spec) {
arg query
rest files
}
Capture zero or more positional arguments not already captured by arg. So, for ARGV = :| hello file.txt message.txt README.md |, we would have args.query = "hello" and args.files = ["file.txt", "message.txt", "README.md"].
Without rest, passing extraneous arguments will raise an error in parseArgs().
rest can only be called once within a parser. Calling it multiple times will raise an error in parser.
Given a parser specification spec produced by parser, parse a list of strings (usually ARGV).
var args = parseArgs(spec, ARGV)
The returned args is a dictionary mapping the names of each arg, flag, and rest to their captured values. (See the example at the start of this topic.)
parseArgs will raise an error if the ARGV is invalid per the parser specification. For example, if it's missing a required positional argument:
parser (&spec) {
arg path
}
var args = parseArgs(spec, [])
# => raises an error about the missing 'path' (status = 2)