Bash/Shell#
HDD parameters#
Check disk space:
df -h
To list all block devices, run:
lsblk
To list all partitions, run:
fdisk -l
File/Folders#
Copying#
Copy (and synchronize) with rsync#
rsync -azP <FILE_SRC> <FILE_DEST>
rsync -azP <FOLDER_SOURCE> <FOLDER_DEST> # will create FOLDER_SOURCE inside FOLDER_DEST (if it does not exist), and will copy the content of FOLDER_SOURCE inside it
rsync -azP <FOLDER_SOURCE>/ <FOLDER_DEST> # will copy the content of FOLDER_SOURCE inside FOLDER_DEST
# For nii.gz files, no need to further compress so -z can be dropped
Copy from remote station via scp#
scp username@hostname:</PATH_TO_FILE> . # copy to the current folder
Finding#
find . -name "dti*"
To be case-insensitive, use:
find . -iname "dti*"
To only look for folders/directories:
find . -type d -iname "dti*"
To only look for files:
find . -type f -iname "dti*.*"
Deleting#
Delete non-empty folder#
rm -rf <FOLDER>
Delete a bunch of files#
find . -name "dti*" -delete
or the more complicated version:
find . -name "dti*" | while read F; do rm $F; done
When you try to delete too many files with rm, you may get the error "/bin/rm: Argument list too long". Use xargs to avoid this problem:
find ~ -name "*.log" -print0 | xargs -0 rm -f
Renaming#
Rename files with a given extension#
ls *.<EXT> | while read F; do mv $F <NEW_FILE_NAME>_$F; done
Do it recursively:
find . -name "t2_seg.nii.gz" -exec bash -c 'mv $(dirname $1)/$(basename $1) $(dirname $1)/t2_seg_manual.nii.gz' -- {} \;
Do something on files last modified more than 10 days ago
find . -type f -name '*.*' -mtime +10 -exec echo "do something on this file: {}" \;
Set creation/modification date of a file
touch -mt YYYYMMDDhhmm <FILE>
On OS X Mavericks and later, touch does not change the creation date if the date you set is newer than the existing one. To set the creation date, use:
SetFile -d 'DD/MM/YYYY HH:MM:SS' <FILE>
Size of folder#
du -sh <FOLDER>
or for all folders in the path
du -csh *
du -sm * | sort -nr # in MB and reverse-ordered by size
du -hcs * | sort -h # in human-readable
Number of files #
Get number of files that match a pattern
ls -dq *pattern* | wc -l
Get number of files in a folder (recursively)
find .//. ! -name . -print | grep -c //
To only count files modified in the past 24 h:
find .//. ! -name . -mtime -1 -print | grep -c //
List files modified in the past 24 h
find . -mtime -1 -print
List number of files per folder
find . -maxdepth 1 -mindepth 1 -type d -exec sh -c 'echo "{} : $(find "{}" -type f | wc -l)" file\(s\)' \;
Permissions#
Change permissions#
chmod 644 # make a file readable by anyone and writable by the owner only.
chmod 755 # make a file readable/executable by everyone and writable by the owner only.
chmod 701 # rwx for the owner, no access for the group, execute-only for others
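Each octal digit is the sum of read (4), write (2) and execute (1), for owner, group and others respectively. A couple of extra illustrations (the file name is a placeholder):
chmod 640 <FILE> # owner: rw- (4+2), group: r-- (4), others: --- (0)
chmod u+x <FILE> # symbolic form: add execute permission for the owner only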
Change owner of a file#
sudo chown <OWNER> <FILE>
Look for group owner & permission#
ls -le@a # macOS: -e shows ACLs, -@ shows extended attributes, -a includes hidden files
Find most recently changed files (less than 1 day ago)
find . -mtime -1 -ls
Search files
Files that contain a specific string:
grep -rl "string" .
Files modified within the past 24 hours:
find ~/Documents -type f -mtime -1 | more
Stdout / Stderr #
https://askubuntu.com/questions/420981/how-do-i-save-terminal-output-to-a-file
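In short, the usual redirection patterns are (output file names are arbitrary):
some_command > out.log # stdout only
some_command 2> err.log # stderr only
some_command > all.log 2>&1 # stdout and stderr to the same file
some_command 2>&1 | tee all.log # both to a file and to the terminal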
Compression/Extraction#
tar#
compress:
tar -czf /path/to/output/folder/filename.tar.gz /path/to/folder
extract:
tar -zxvf filename.tar.gz
zip#
compress folder:
zip -r archive.zip folder/
# Exclude a sub-folder:
zip -r archive.zip folder/ -x '*subfoldertoexclude*'
extract:
unzip archive.zip
Copy files into a directory
find . -name "sica*.png" | xargs -t -I {} cp {} ./images
Checksum #
This procedure creates a unique signature for your files and folders, which lets you check data integrity when you share them.
find FOLDER -type f -exec md5sum {} \; | md5sum # Linux
find -s FOLDER -type f -exec shasum {} \; | shasum # macOS (-s sorts the traversal)
find -s FOLDER -type f -exec md5 {} \; | md5 # macOS
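For example, to verify that a folder arrived intact after a transfer, run the same command on both machines and compare the two signatures (the folder name is illustrative); if the two systems traverse files in a different order, pipe the per-file checksums through sort before the final hash:
find data/ -type f -exec md5sum {} \; | sort | md5sum # run on both machines; the final hashes should match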
Remove tmp.* folders #
find . -name "tmp.*" -type d -print0 | xargs -0 /bin/rm -rf
.bash_profile#
The .bash_profile file is sourced when you open a new terminal (login shell). You can configure your environment variables there. It is located in your home folder ($HOME).
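A minimal example of what it may contain (the paths and values below are only illustrative):
export PATH="$HOME/bin:$PATH" # add a personal bin folder to the PATH
export EDITOR="vim" # default editor
alias ll="ls -lh" # handy alias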
To load it:
source ~/.bash_profile
Emails#
send email
echo "something" | mailx -s "subject" someone@email.com
Processes#
Check Processes#
pstree -ap
ps aux
top
Killing Processes#
kill a process based on PID#
kill -9 <"PID">
kill a process from a user#
pkill -U <USER>
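If you do not know the PID, you can look it up by process name first (the script name below is illustrative):
pgrep -f my_script.py # print the matching PIDs
kill -9 $(pgrep -f my_script.py) # kill them all (use with care)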
Internet / Network#
Download file from internet#
curl -o filename -L <URL>
# Example for OSF file (note the "?action=download" added after the URL):
curl -o data.zip -L https://osf.io/76jkx/?action=download
Alternatively:
wget -O data <URL>
Copying#
Copy file between computers#
https://clbin.com/ (featuring CLI uploading) (running rupa/sprunge)
https://paste.fossdaily.xyz/ (running privatebin)
https://paste.tildeverse.org/ (running privatebin)
https://bin.snopyta.org/ (running privatebin)
https://pb.envs.net/ (running privatebin)
https://sebsauvage.net/paste/ (running zerobin; an unmaintained forerunner of privatebin)
https://0bin.net/ (running sametmax/0bin)
https://demo.lufi.io/ (running https://lufi.io)
https://ybits.io/ (closed source but oh well)
https://upload.disroot.org/ (running https://lufi.io/)
https://framadrop.org (running lufi.io)
https://ttm.sh/ (featuring CLI uploading)
Using gist.github.com (only for files <100MB):
1. make a new gist
2. note its ID in its URL (something like 3daa207ea45c75722bd0e3bc914dce3a)
3. `git clone git@gist.github.com:3daa207ea45c75722bd0e3bc914dce3a`
4. `cd 3daa207ea45c75722bd0e3bc914dce3a`
5. add your large file;
6. `git add .; git commit; git push`
Copy from a remote station#
scp username@station.domain:</PATH/FILE> . # copy a file
scp -r username@station.domain:</PATH/> . # copy a folder
Network/DNS#
List all stations on the network (only works on a server)
findsmb
Find the DNS servers in use
cat /etc/resolv.conf
Look up a host in DNS
host HOST_NAME
host IP_ADDRESS
Clear DNS cache (on OSX 10.8 and later)
sudo killall -HUP mDNSResponder
Connect to another station
ssh IP
or:
ssh username@station.domain
Screen (for background processes)#
Let's say you connect to a station from your laptop and you wish to launch a script that will run for several hours. If you close your laptop, the remote script will stop. To prevent this, use screen. It opens a virtual terminal session on the remote station, so any script launched within this session keeps running even if you close your laptop.
Step-by-step procedure:
Connect to a station via ssh.
Launch screen. It will create a new screen attached to the station.
Do whatever you want (e.g., launch a long process).
Detach from the screen:
screen -d
Or, using a shortcut: press and hold CTRL+A, then hit D.
Attach to a detached screen:
screen -r
Attach to a screen that is not detached (multi-display mode):
screen -x
List of your screens
screen -ls
Kill a screen
screen -X kill # (if you only have one screen running)
screen -X -S [session # you want to kill] kill
Or, from inside the screen, press CTRL+D (or CTRL+A, then K).
Give specific name to a screen session
screen -S <NAME_OF_SESSION>
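Typical workflow combining the commands above (the session name is arbitrary):
screen -S training # create a named session
# launch your long process, then detach with CTRL+A, D
screen -ls # list sessions
screen -r training # reattach later, even from a new ssh connection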
Scroll inside a screen
Use the combination CTRL+A, then [. Then move up and down with the arrow keys (↑ and ↓). Press ESC to exit this mode.
SSH Public Key#
Create key on the client (do this only once):
ssh-keygen -t rsa
Copy the key to the server:
ssh-copy-id demo@198.51.100.0
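If ssh-copy-id is not available, the public key can be appended manually; this assumes the default key path ~/.ssh/id_rsa.pub:
cat ~/.ssh/id_rsa.pub | ssh demo@198.51.100.0 'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys'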
VIM Text Editor#
Simple but great editor. Usually installed everywhere.
:w = save
:q = quit
:wq = quit and save
Coloured syntax:
vi ~/.vimrc
add: syntax on
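A quick way to enable it directly from the shell (appends the line, creating ~/.vimrc if needed):
echo "syntax on" >> ~/.vimrc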