Helpful Linux/Mac Commands
Cat files to your clipboard
cat yourfile.txt | pbcopy
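pbcopy (and its counterpart pbpaste) are macOS-only; on Linux, xclip is a common stand-in, assuming it's installed. A quick sketch (file names are just placeholders):
# Paste the clipboard back out (macOS)
pbpaste > pasted.txt
# Roughly equivalent copy on Linux, assuming xclip is installed
cat yourfile.txt | xclip -selection clipboard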
Size Info
On macOS, you can use the du (disk usage) command. Here are the most common ways:
Basic command:
du -sh /path/to/folder
The flags mean:
- -s = summary (shows total only, not each subfolder)
- -h = human-readable format (shows KB, MB, GB instead of bytes)
Example:
du -sh ~/Documents
This might output something like: 1.2G /Users/yourname/Documents
If you want to see the size of each subfolder too:
du -h /path/to/folder
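If that full recursive listing is too noisy, a handy variation is to limit du to the immediate subfolders and sort by size (the -d depth flag works with the BSD du that ships with macOS):
# Size of each immediate subfolder, largest first
du -h -d 1 /path/to/folder | sort -rh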
To see just the current directory:
du -sh .
Alternative using find and ls (less common but works):
find /path/to/folder -type f -exec ls -l {} \; | awk '{sum += $5} END {print sum/1024/1024 " MB"}'
The du -sh command is definitely the simplest and most straightforward option. It automatically includes all files in subdirectories in its calculation.
Count Number of Files
On macOS, here are several ways to count files in a folder and all its subfolders:
Simplest method using find (counts regular files only, not directories):
find /path/to/folder -type f | wc -l
For the current directory:
find . -type f | wc -l
If you want to count both files AND directories:
find /path/to/folder | wc -l
Count only directories:
find /path/to/folder -type d | wc -l
Count files with a specific extension (e.g., .txt files):
find /path/to/folder -type f -name "*.txt" | wc -l
Using ls with recursion (alternative, but less reliable):
ls -lR /path/to/folder | grep "^-" | wc -l
The find method is the most reliable and standard approach. The flags mean:
- -type f = only files (not directories)
- -type d = only directories
- wc -l = count lines (which equals the number of items found)
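For a rough breakdown by extension rather than a single total, piping the find output through sort and uniq works. A quick sketch (hidden files like .gitignore get counted under their full name):
# Count files per extension in the current directory tree, most common first
find . -type f -name "*.*" | sed 's/.*\.//' | sort | uniq -c | sort -rn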
List Largest Files
Here are several ways to list the largest files on macOS:
Using find and du (recommended):
find /path/to/folder -type f -exec du -h {} + | sort -rh | head -20
This shows the 20 largest files in human-readable format (MB, GB, etc.).
For the current directory:
find . -type f -exec du -h {} + | sort -rh | head -20
Using find -ls with sorting:
find /path/to/folder -type f -ls | sort -k7 -rn | head -20
More detailed output with full paths:
find /path/to/folder -type f -exec ls -lh {} \; | awk '{print $5, $9}' | sort -rh | head -20
Search entire system (may take a while and require sudo):
sudo find / -type f -exec du -h {} + 2>/dev/null | sort -rh | head -20
To see the top 10 instead of 20: Change head -20 to head -10
Alternative using du for a quick overview:
du -ah /path/to/folder | sort -rh | head -20
The flags mean:
- -type f = files only
- sort -rh = sort in reverse order, human-readable numbers
- head -20 = show top 20 results
- 2>/dev/null = suppress error messages
The first command (find with du and sort -rh) is usually the best option as it handles human-readable sizes correctly and gives you clean output.
Find files greater than a certain size
find /path/ -type f -size +64M (finds files over 64 MB)
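To also see how large each of those files is, the same find can feed du, sorted largest first (same idea as the earlier "largest files" command, just filtered by size):
find /path/ -type f -size +64M -exec du -h {} + | sort -rh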
File Transfer
Using rsync
rsync -Ph user@example.com:/var/lib/backup/foo.backup .
-h makes the units of measure “human-readable”
The -P flag is shorthand for --partial --progress, which:
- --partial keeps partially downloaded files
- --progress shows transfer progress
For interrupted downloads, simply run the same command again:
rsync -avP user@remote:/path/to/file /local/destination/
If the transfer was interrupted, rsync will check what’s already been transferred and continue from that point.
Additional useful options:
- --append – append data to shorter files (useful if you know files only grow)
- --append-verify – like append but verifies the existing data with checksums
- --partial-dir=DIR – store partial files in a specific directory instead of the destination
Complete example for resuming a large download:
rsync -avP --partial-dir=/tmp/rsync-partial user@server:/remote/file.zip /local/path/
Just rerun the exact same rsync command you used originally, and it will automatically detect what’s already been transferred and resume from there. The -P flag is really the key – it ensures partial files are kept so they can be resumed.
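If you're unsure what a given rsync invocation will actually copy, adding -n (--dry-run) prints what would be transferred without copying anything (paths here are just the same placeholders as above):
rsync -avPn user@server:/remote/file.zip /local/path/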
Alternatively you can use scp, but any failure in the download requires a restart from the beginning.
scp user@example.com:/var/lib/backup/foo.backup ~/foo.backup
List What is Listening on What Port and its PID
On macOS, you can use the lsof command to list what’s listening on which ports along with process IDs:
sudo lsof -iTCP -sTCP:LISTEN -n -P
Breaking down the options:
- -iTCP: Shows only TCP connections
- -sTCP:LISTEN: Filters for listening sockets only
- -n: Prevents DNS lookups (faster)
- -P: Shows port numbers instead of service names
This will show output with columns including the process name, PID, user, and the port it’s listening on.
If you want to find what’s listening on a specific port (e.g., port 8080):
sudo lsof -iTCP:8080 -sTCP:LISTEN
Alternatively, you can use netstat:
netstat -anv | grep LISTEN
netstat won't show PIDs directly, though; to get the PID you'd have to cross-reference its output with another tool.
The lsof approach is generally more straightforward for getting both the port and PID information you need.
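A related trick: lsof's -t flag prints only the PID(s), which makes it easy to feed straight into kill. For example, to stop whatever is listening on port 8080 (double-check it's something you actually want to kill):
sudo kill $(sudo lsof -tiTCP:8080 -sTCP:LISTEN)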
Zip
ZIP files
To zip files without including the directory structure on Mac, use the -j (junk paths) option:
zip -j output.zip /path/to/directory/*
This will create output.zip containing all the files from that directory, but without preserving the directory structure.
Example:
zip -j my_files.zip ~/Documents/MyFolder/*
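To confirm the paths really were junked, listing the archive (see the unzip -l section below) should show bare file names only:
unzip -l my_files.zip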
A few notes:
- The -j flag tells zip to store just the file names, ignoring the path information
- The * at the end grabs all files in that directory
- If you want to include files from subdirectories too (flattening everything), you can use:
cd /path/to/directory
find . -type f -exec zip -j ../output.zip {} +
- If you need to exclude hidden files, the * wildcard already does this by default. To include them, add .* as well.
List Zip File Contents
To list files in a zip archive from the command line on Mac, use:
unzip -l filename.zip
This shows a detailed list with file sizes, dates, and names.
Other useful options:
- Just the filenames (no extra info): unzip -Z1 filename.zip
- Verbose listing (more details including compression ratios): unzip -v filename.zip
- Test the archive (checks if files can be extracted): unzip -t filename.zip
Example output of unzip -l:
Archive: my_files.zip
Length Date Time Name
--------- ---------- ----- ----
1234 12-23-2024 10:30 file1.txt
5678 12-23-2024 10:31 file2.pdf
--------- -------
6912 2 files
Unzip to a directory
To unzip files to a new folder, use the -d option:
unzip filename.zip -d /path/to/destination
Examples:
# Unzip to a new folder called "extracted"
unzip my_files.zip -d extracted
# Unzip to a specific path
unzip my_files.zip -d ~/Documents/extracted_files
# Unzip to current directory (default behavior)
unzip my_files.zip
Notes:
- The destination folder will be created automatically if it doesn’t exist
- Files will extract with their original directory structure (if any was preserved in the zip)
- If you want to extract without creating subdirectories, add the -j flag: unzip -j filename.zip -d destination_folder
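Two more unzip variations that come in handy: extracting a single named file, and excluding files that match a pattern with -x (the file name and pattern below are just examples):
# Pull out just one file from the archive
unzip my_files.zip file1.txt -d extracted
# Extract everything except .log files
unzip my_files.zip -x "*.log" -d extracted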