 

Commonly used commands and scripts

  • Systemctl related commands:
    • Systemd common commands (start / stop / restart / status) (enable / disable for boot control)
    • List current Systemd operating units: sudo systemctl list-units | grep '*'. Change or remove the grep statement as required.
    • Check the boot journal: sudo journalctl -xe
  • To check for a running process: ps -A | grep open for openvpn (or ps -A | grep del for deluge)
  • To change time zone from command line: sudo dpkg-reconfigure tzdata.
  • Networkd → systemd-networkd
  • Resolved → systemd-resolved
  • sudo apt update to update apt packages
  • sudo apt upgrade to upgrade apt packages, after running update
  • sudo apt install package to install a package
  • sudo apt list --installed to list installed packages, use grep to filter, e.g.: sudo apt list --installed | grep php to see all installed php packages.
  • sudo apt remove package to remove a package; wildcards can be used, e.g. sudo apt remove php7.4* to remove all installed php7.4 packages
The table below lists the php packages installed for the mail server, Nextcloud and the wiki:

package              description
php-apcu             object read caching
php-apcu-bc          APCu backwards-compatibility module
php-bcmath           arbitrary precision mathematics
libapache2-mod-php   apache2 php module
php-common           common php file modules
php-curl             php client for URLs module
php-gd               php image library module
php-igbinary         php binary type storage
php-imagick          image manipulation module
php-json             JSON support module
php-mail-mime        MIME message creation and decoding
php-mailparse        email message parsing
php-mbstring         multibyte string handling
php-mime-type        file MIME type detection
php-mysql            MySQL/MariaDB database module
php-pear             PEAR package framework
php-redis            Redis client module
php-xmlrpc           XML-RPC client/server module
php7.4-cli           command line interpreter
php7.4-common        common files for PHP 7.4 modules
php7.4-curl          curl module for PHP 7.4
php7.4-dev           development files and headers
php7.4-gd            image library module
php7.4-imap          IMAP module
php7.4-intl          internationalisation module
php7.4-json          JSON module
php7.4-mbstring      multibyte string module
php7.4-mysql         MySQL/MariaDB module
php7.4-opcache       opcode cache
php7.4-readline      readline support
php7.4-xml           XML handling module
php7.4-xmlrpc        XML-RPC module
php7.4-zip           zip module
pkg-php-tools        tools for packaging PHP

Create the following file to check the php installation, sudo vim /var/www/html/test.php, then browse to the server's /test.php page:

<?php
 phpinfo();
?>

The following is taken from my server root crontab, sudo crontab -e:

# Edit this file to introduce tasks to be run by cron.
# 
# Each task to run has to be defined through a single line
# indicating with different fields when the task will be run
# and what command to run for the task
# 
# To define the time you can provide concrete values for
# minute (m), hour (h), day of month (dom), month (mon),
# and day of week (dow) or use '*' in these fields (for 'any').
# 
# Notice that tasks will be started based on the cron's system
# daemon's notion of time and timezones.
# 
# Output of the crontab jobs (including errors) is sent through
# email to the user the crontab file belongs to (unless redirected).
# 
# For example, you can run a backup of all your user accounts
# at 5 a.m every week with:
# 0 5 * * 1 tar -zcf /var/backups/home.tgz /home/
# 
# For more information see the manual pages of crontab(5) and cron(8)
# 
# m  h dom mon dow   command
# Example of job definition:
# .---------------- minute (0 - 59)
# |   .------------- hour (0 - 23)
# |   |   .---------- day of month (1 - 31)
# |   |   |   .------- month (1 - 12) OR jan,feb,mar,apr ...
# |   |   |   |   .---- day of week (0 - 6) (Sunday=0 or 7) OR sun,mon,tue,wed,thu,fri,sat,sun
# |   |   |   |   |
# *   *   *   *   *   user-name command to be executed
  0   0   1   *   * /home/shared/Myscripts/rotatelog.sh /var/log/UPSLog.Log #Rotate NUT UPS log file
  0,5,10,15,20,25,30,35,40,45,50,55 * * * * /home/shared/Myscripts/UPSScan.sh 2>/dev/null #Runs Nut scan and logs formatted results to /var/log/UPSLog.Log
 #0   *   *   *   * /etc/webmin/bandwidth/rotate.pl
  15  0   *   *   1 /home/shared/Myscripts/server_backup.sh weekly # Run weekly backup script every Monday at 00:15
  15  0  1-7  *   * [ $(date +%u) -eq 3 ] && /home/shared/Myscripts/server_backup.sh monthly # run monthly backup script on the first Wednesday of the month at 00:15
  15  0   * 3,6,9,12 5 [ $(date +%-d) -le 7 ] && /home/shared/Myscripts/server_backup.sh quarterly # run quarterly backup script at 00:15 on the first Friday of every 3rd month (March, June, September, December)

As per man crontab.5

Note: The day of a command's execution can be specified in the following two fields — 'day of month', and 'day of  week'. If both fields are restricted (i.e., do not contain the "*" character), the command will be run when either field matches
the current time.  For example, "30 4 1,15 * 5" would cause a command to be run at 4:30 am on the 1st and 15th of each month, plus every Friday.

Hence, the crontab entry 15 0 1-7 * 3 command would run the command on days 1 to 7 of the month and also on every Wednesday.
So an additional filter is added to the command statement to get the desired behaviour. [ $(date +%-d) -le 7 ] && command means the command will only be run if the day of the month is less than or equal to 7. Similarly, [ $(date +%u) -eq 3 ] && command means the command will only run if the day of the week is 3 (Wednesday).
See man date for information on the date command and its output formats.
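The guard tests can be checked from the shell against fixed dates (using GNU date's -d option; 2022-09-07 is used here simply because it fell on a first Wednesday):

```shell
# Check the cron guard logic against a fixed date (GNU date -d).
# 2022-09-07 was the first Wednesday of that month.
dow=$(date -d 2022-09-07 +%u)    # day of week, 1=Monday .. 7=Sunday
dom=$(date -d 2022-09-07 +%-d)   # day of month, no leading zero
echo "dow=$dow dom=$dom"
[ "$dow" -eq 3 ] && echo "monthly guard passes"
[ "$dom" -le 7 ] && echo "quarterly day-of-month guard passes"
```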


  • man man to get manual for man
  • man apropos - manual for apropos, used to search the manual page names and descriptions
  • apropos -e -w crontab to list available manual sections of crontab. Try this first.
  • man crontab.1 will display crontab(1)
  • man crontab.5 will display crontab(5)
The table below shows the section numbers of the manual followed by the types of pages they contain.

       1   Executable programs or shell commands
       2   System calls (functions provided by the kernel)
       3   Library calls (functions within program libraries)
       4   Special files (usually found in /dev)
       5   File formats and conventions, e.g. /etc/passwd
       6   Games
       7   Miscellaneous (including macro packages and conventions), e.g. man(7), groff(7)
       8   System administration commands (usually only for root)
       9   Kernel routines [Non standard]

  • man find
  • find -name 'filename*' this will search for names matching filename* in the current directory and below
  • find /mnt/disk1 -name 'findme*' to find findme* under the directory /mnt/disk1. The quotes stop the shell expanding the wildcard itself; they are only strictly required when the pattern contains white space or could match files in the current directory, but quoting is the safe habit.
  • find -path './con*' to match against the whole path rather than just the file name
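A quick way to see this in action, using a throwaway directory (the names below are invented for the demonstration):

```shell
# Build a small scratch tree and search it.
mkdir -p /tmp/finddemo/sub
touch /tmp/finddemo/findme1.txt /tmp/finddemo/sub/findme2.txt
find /tmp/finddemo -name 'findme*'                # matches in sub-directories too
find /tmp/finddemo -maxdepth 1 -name 'findme*'    # top level only
```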

awk, grep, sed

Blatantly plagiarised from Linux/BSD command line wizardry: Learn to think in sed, awk, and grep

Trying to read the man pages for the utilities most frequently seen in these extended command chains didn't make them seem more approachable, either. For example, the sed man page weighs in at around 1,800 words alone without ever really explaining how regular expressions work or the most common uses of sed itself.

If you find yourself in the same boat, grab a beverage and buckle in. Instead of giving you encyclopedic listings of every possible argument and use case for each of these ubiquitous commands, we're going to teach you how to think about them—and how to easily, productively incorporate them in your own daily command-line use.

Redirection 101

Before we can talk about sed, awk, and grep, we need to talk about something a bit more basic—command-line redirection. Again, we're going to keep this very simple:

  • && Process the command on the right after the command on the left has completed successfully.
  • ; Process the command on the right after you're done processing the command on the left. Example: echo one ; echo two
  • > Place the output of the thing on the left in the empty file named on the right. Example: ls /home/me > myfilesonce.txt ; ls /home/me > myfilesonce.txt
  • >> Append the output of the thing on the left to the end of the existing file on the right. Example: ls /home/me > myfilestwice.txt ; ls /home/me >> myfilestwice.txt
  • < Use the file on the right as the standard input of the command on the left. Example: cat < sourcefile > targetfile
  • | Pipe the standard output of the thing on the left into the standard input of the thing on the right. Example: echo "test123" | mail -s "subjectline" emailaddress

Understanding these redirection operators is crucial to understanding the kinds of wizardly command lines you're presumably here to learn. They make it possible to treat individual, simple utilities as part of a greater whole.

And that last concept—breaking one complex task into several simpler tasks—is equally necessary to learning to think in complex command-line invocations in the first place!

When first learning about tools like grep, I find it helps to think of them as far simpler than they truly are. In that vein, grep is the tool you use to find lines that contain a particular string of text.

For example, let's say you're interested in finding which ports the apache web server has open on your system. Many utilities can accomplish this goal; netstat is one of the older and better-known options. Typically, we'd invoke netstat using the -anp arguments—for all sockets, numeric display, and displaying the owning pid of each socket.

Unfortunately, this produces a lot of output—frequently, several tens of pages. You could just pipe all that output to a pager, so you can read it one page at a time, with netstat -anp | less. Or, you might instead redirect it to a file to be opened with a text editor: netstat -anp > netstat.txt.

But there's a better option. Instead, we can use grep to return only the lines we really want. In this case, what we want to know about is the apache webserver. So:
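The four invocations discussed below look like this; live netstat output needs root and varies per machine, so a small captured sample file stands in for it here:

```shell
# Fabricated netstat-style sample so the pipelines can be run anywhere.
cat > /tmp/netstat.txt <<'EOF'
Active Internet connections (servers and established)
Proto Recv-Q Send-Q Local Address   Foreign Address  State    PID/Program name
tcp        0      0 0.0.0.0:22      0.0.0.0:*        LISTEN   813/sshd
tcp6       0      0 :::80           :::*             LISTEN   4011/apache2
EOF
head -n5 /tmp/netstat.txt                                  # limit to the first five lines
wc -l < /tmp/netstat.txt                                   # total line count
grep apache /tmp/netstat.txt                               # only the apache lines
head -n2 /tmp/netstat.txt ; grep apache /tmp/netstat.txt   # headers, then apache
```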

We introduced some new commands above: head, which limits output to the first n lines and then truncates it. There's also wc, which, with the argument -l, tells you how many lines of text hit its standard input.

So we can translate the four commands above into plain English:

  1. sudo netstat -anp | head -n5 : “Find all the open network sockets, but limit output to the first five lines.”
  2. sudo netstat -anp | wc -l : “Find all the open network sockets, then tell me how many total lines of text you'd have used to tell me.”
  3. sudo netstat -anp | grep apache : “Find all the open network sockets, but only show me the results that include the word 'apache.'”
  4. sudo netstat -anp | head -n2 ; sudo netstat -anp | grep apache : “Find all the open network sockets, but only show me the two header lines—then do it again, but only show me the 'apache' results.”

By thinking of grep as something much simpler than it actually is, we can jump immediately to finding productive ways to use it—and we can chain these simple uses together to easily describe more complex tasks!

Once you're comfortable with using grep to find simple strings as seen above, it can do far more complex tasks. These include but are not limited to: case-insensitive use, more complex patterns (including full regular expressions), exclusion (only show me lines that don't include the pattern), and much, much more. But don't worry about that until after you're familiar with simple grep uses. Once you start, it's truly hard to imagine life without grep anymore!
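A taste of those options (standard GNU grep flags; the sample text is invented):

```shell
printf 'Apache\napache2\nnginx\n' | grep -i apache   # case-insensitive: Apache, apache2
printf 'Apache\napache2\nnginx\n' | grep -v apache   # invert: lines NOT containing "apache"
printf 'one1\ntwo22\nthree\n' | grep -E '[0-9]{2}'   # extended regex: two22
```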

Now that you know how to limit output to matching (or nonmatching) lines, the next step is learning how to change that output on the fly. For this, sed—the Stream EDitor—will be your tool of choice.

In order to use sed, you need to understand at least a little about regular expressions (regexes). We are once again going to ignore the vast majority of what regular expressions can do and focus on the most immediately intuitive and useful: simple pattern replacement.

Let's say that you want to change all instances of dog to snake in a bunch of text:

me@banshee:~$ echo "I love my dog, dogs are great!"
I love my dog, dogs are great!

me@banshee:~$ echo "I love my dog, dogs are great!" | sed 's/dog/snake/'
I love my snake, dogs are great!

me@banshee:~$ echo "I love my dog, dogs are great!" | sed 's/dog/snake/g'
I love my snake, snakes are great!

We can translate these three commands into plain English:

  1. say “I love my dog, dogs are great!”
  2. say “I love my dog, dogs are great!” but change the first instance of dog to snake.
  3. say “I love my dog, dogs are great!” but change all instances of dog to snake.

Although we're really just working with plain text, sed actually thinks in regular expressions. Let's unpack the regex s/dog/snake/g: it means to search sed's input for instances of dog and replace them with snake and do so globally. Without the g on the end, sed only makes a single replacement per line of text, as we see in command #2.

Alright, now that we understand the simplest possible regular expressions, what might we use sed for on a real-world command line? Let's return to our first example, in which we looked for open network sockets belonging to apache. This time, let's say we want to know which program opened a socket on port 80:

me@banshee:~$ sudo netstat -anp | grep ::80
tcp6       0      0 :::80           :::*            LISTEN      4011/apache2

me@banshee:~$ sudo netstat -anp | grep ::80 | sed 's/.*LISTEN *//'
4011/apache2

In the first command, we look for any line containing the string ::80, which limits us to the program running on the standard HTTP port. In the second, we do the same thing, but we discard everything prior to the PID and name of the process that owns that socket.

In regex language, . is a special character that matches any single character, and * is a special character that matches zero or more of the preceding character. So .* means “match any number of any characters,” and * (a space followed by an asterisk) means “match any number of spaces.”

This kind of preliminary processing can make reading a text file full of tons of output much easier later—or it can serve to parse “human friendly” command output down to something that can be passed to another utility as an argument later.
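Carrying the example one step further to show the "argument" idea: strip everything from the slash onwards as well, leaving a bare PID another command could consume (the sample line below is a stand-in for the pipeline output above):

```shell
line='4011/apache2'                            # stand-in for the netstat | grep | sed output
pid=$(printf '%s\n' "$line" | sed 's#/.*##')   # delete "/" and everything after it
echo "$pid"                                    # 4011 - could now feed e.g. kill "$pid"
```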

Again, there is far, far more to both sed and regular expressions than we see here—but just like grep, I recommend getting comfortable with the most basic use of sed until it feels natural. Wait to go man-page diving until after you're solid on basic use. Only then should you try to slowly, steadily expand your repertoire!

Once you get comfortable with sed and grep, you'll start to feel like a superhero—until you realize how hard it is to get only the relevant information out of a single column in the middle of a line. That's where awk comes in. It's worth noting that awk is even more potentially complex (and capable) than either sed or grep were—in fact, if you're enough of an awk wizard, you could technically replace both sed and grep in most use cases.

That's because awk is actually an entire scripting language, not just a tool—but that's not how we're going to use it, or think of it, as relative newbies. Instead, we're just going to think of awk as a column extractor. Once again, we'll return to our netstat example. What if we want to find out which port Apache is running on?
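The three invocations described below, sketched against a single invented netstat-style line (in netstat's layout, fields 4 and 7 hold the local address and the PID/program name):

```shell
sample='tcp6       0      0 :::80      :::*       LISTEN      4011/apache2'
printf '%s\n' "$sample" | grep apache                        # the whole matching line
printf '%s\n' "$sample" | grep apache | awk '{print $4}'     # fourth column only: :::80
printf '%s\n' "$sample" | grep apache | awk '{print $4, $7}' # columns 4 and 7
```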

Once again, we'll translate our examples into plain English:

  1. Find all open sockets and the programs that own them, but limit output to the ones with the text 'apache' in them.
  2. Find all open sockets and the programs that own them, but limit output to the ones with the text 'apache' in them—and limit that output to the fourth tabular column only.
  3. Find all open sockets and the programs that own them, but limit output to the ones with the text 'apache' in them—and limit that output to the fourth and seventh tabular columns.

Since awk is an entire language, its syntax may feel slightly tortured. You need to encapsulate its command arguments in single quotes, then in curly brackets, and you have to use the keyword print in addition to your column numbers. But that syntax will feel like second nature before you know it—and the seemingly over-complex syntax makes it possible later to use awk for more complex tasks, like calculating running sums and even averages:
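A sketch of the sum and average idea (the two-column data file here is invented):

```shell
printf '%s\n' 'a 1' 'b 2' 'c 3' > /tmp/vals.txt      # hypothetical two-column data
awk '{SUM += $2} END {print SUM}'    /tmp/vals.txt   # running sum of column 2: 6
awk '{SUM += $2} END {print SUM/NR}' /tmp/vals.txt   # average over all rows: 2
```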

In the above examples, we add the value of the specified column—the second column, specified with $2—to a variable we name SUM. After adding the value of column 2 in each row to SUM, we can either output SUM directly, or we can divide it by a predefined special variable, NR, which means “Number of Rows.”

And yes, if you're wondering, awk handles decimals fine, as well as varying amounts of whitespace in between columns:
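For example (invented values, with deliberately uneven spacing between columns):

```shell
printf 'x 1.5\ny      2.25\n' | awk '{SUM += $2} END {print SUM}'   # 3.75
```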

As always, I strongly encourage you to get comfortable with the most basic use of awk—finding data by which column it's in rather than trying to hunt for it by identifying text before and after it—before worrying about its fancier, more complex uses.

The ip command notes. The following commands call up information only:

  • ip addr or ip a for short, shows the network IP addresses assigned for each link
  • ip link or ip l for short, shows the network link devices
  • ip route or ip r for short, shows the routing table
  • ip maddr or ip m for short, displays multicast IP addresses
  • ip neigh or ip n for short, shows the neighbour objects (ARP table for IPv4)
  • ip help for overall help
  • ip a help for specific help for ip a

ethtool query or control network driver and hardware settings

  • ethtool -i eno1 Display driver information for eno1
  • ethtool -S eno1 Display network and driver statistics for eno1

ss display socket statistics

  • ss -a show all sockets (listening or non-listening)



  • /mnt/shared/www/dokuwiki/data/pages/home_server/home_server_setup/other_services/misc.txt
  • Last modified: 2022-09-05 Mon wk36 14:59
  • by baumkp