Ubuntu 14.04.1.
I have a bash script, called by cron every 10 minutes, which basically looks for files in a subdir, then loops through and processes each file. But how do I check whether no files were found? If there are no files found I don't want to process anything, and I don't want cron to email me "no files found" every 10 minutes. That's 144 emails per day saying "no files found" that I don't want to get.
- The input/ dir is owned by me and has full rwx permissions.
- I've already guaranteed the files in input/ do not contain spaces, thanks to another answer on Ask Ubuntu.
Here's my basic script.
#!/bin/bash
# Cron requires a full path in $myfullpath
myfullpath=/home/comp/progdir
files=`ls $myfullpath/input/fedex*.xlsx`
# How do I check for no files found here and exit without generating a cron email?
for fullfile in $files
do
done

Thanks! I didn't even know what to google for this one.
EDIT: My script is now this:
#!/bin/bash
# Script: gocronloop, Feb 5, 2015
# Cron requires a full path to the file.
mydir=/home/comp/perl/gilson/jimv/fedex
cd $mydir/input
# First remove spaces from filenames in input/
find . -name "* *" -type f | rename 's/ /-/g'
cd $mydir
shopt -s nullglob
if [ $? -ne 0 ]; then
    echo "ERROR in shopt"
    exit 1
fi
for fullfile in "$mydir"/input/fedex*.xlsx
do
    # First remove file extension to get full path and base filename.
    myfile=`echo "$fullfile" | cut -d'.' -f1`
    echo -e "\nDoing $myfile..."
    # Convert file from xlsx to xls.
    ssconvert $myfile.xlsx $myfile.xls
    # Now check status in $?
    if [ $? -ne 0 ]; then
        echo "ERROR in ssconvert"
        exit 1
    fi
    perl $1 $mydir/fedex.pl -input:$mydir/$myfile.xls -progdir:$mydir
done

7 Answers
First things first: Don't parse ls.
Now that we have got that out of the way, use globbing, along with nullglob:
shopt -s nullglob
for fullfile in "$myfullpath"/input/fedex*.xlsx
do
#.......
done

Usually with globbing, if * doesn't match anything it's left as is. With nullglob, it is replaced with nothing, so a false match isn't triggered.
For example:
$ bash -c 'a=(foo/*); echo ${a[@]}'
foo/*
$ bash -c 'shopt -s nullglob; a=(foo/*); echo ${a[@]}'
$

If you're dead-set on using ls anyway, despite its unsuitability for your original code, or if you just want to find out whether ls found any files, you could check its exit code. A "No such file..." will fail (exit code 2), while even an empty directory's ls will succeed (exit code 0):
$ ls *.xls
ls: cannot access *.xls: No such file or directory
$ echo $?
2
$ ls
$ echo $?
0

Let find do the hard work for you. Write a script that processes a file passed as the first parameter, then do this in your crontab:

find /wherever -iname 'fedex*.xlsx' -exec your-script "{}" \;

find will not generate any output if it doesn't find files matching the expression.
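The answer leaves your-script abstract; here is a minimal sketch of what it might contain. The ssconvert step mirrors the question's edited script; the script name and everything else are assumptions, not the answer's own code:

```shell
#!/bin/bash
# Hypothetical your-script: find invokes it with one matching file as $1.

process_one() {
    local f=$1
    [ -f "$f" ] || return 1            # defensive: must be handed an existing file
    local base=${f%.*}                 # strip the extension with parameter expansion
    # Convert xlsx to xls, as in the question's edited script.
    ssconvert "$f" "$base.xls" || { echo "ERROR in ssconvert" >&2; return 1; }
}

if [ "$#" -ge 1 ]; then
    process_one "$1"
fi
```

Because the script itself prints nothing on the no-argument path, a cron run that finds no files generates no mail.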
Python seems a comfortable option as well if I am not missing the point:
#!/usr/bin/env python3
import subprocess
import os
myfullpath = "/home/jacob/Bureaublad"
files = [f for f in os.listdir(myfullpath) if f.endswith(".xlsx")]
for f in files:
    cmd = "gedit " + os.path.join(myfullpath, f)
    subprocess.check_call(["/bin/bash", "-c", cmd])

Something like this should work:
cd "$myfullpath/input/"
if test -n "$(shopt -s nullglob; echo fedex*.xlsx)"
then
    for file in fedex*.xlsx
    do
        fullfile="$myfullpath/input/$file"
        # things
    done
fi
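A variant of the same idea, sketched here as an assumption rather than the answer's own code: collect the matches into an array (with nullglob set) and bail out silently when it is empty, which also keeps cron's mail quiet. The path is the one from the question; the rest is illustrative:

```shell
#!/bin/bash
# Sketch: exit silently when the glob matches nothing, so cron sends no email.
myfullpath=/home/comp/progdir      # path taken from the question
shopt -s nullglob
files=("$myfullpath"/input/fedex*.xlsx)
if [ "${#files[@]}" -eq 0 ]; then
    exit 0    # no matches: no output at all, so cron has nothing to mail
fi
for fullfile in "${files[@]}"; do
    echo "Processing $fullfile"
done
```

Quoting "${files[@]}" also keeps each filename intact even if it contains spaces.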
$ if ! ls /tmp/*.bla >/dev/null 2>&1 ; then echo "no meat" ; else echo "have a steak" ; fi
no meat
$ touch /tmp/a.bla
$ if ! ls /tmp/*.bla >/dev/null 2>&1 ; then echo "no meat" ; else echo "have a steak" ; fi
have a steak

Why not use the find command, reading the filenames one at a time so nothing breaks on spaces:

find <dir> -name '*.xlsx' -print0 | while IFS= read -r -d '' fullfile; do
    # fullfile now contains the full filename, including any spaces
    # process to your heart's content, using double quotes (") around $fullfile to
    # make sure the spaces are kept intact
    cp "$fullfile" /new/directory/
done
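Whichever answer is used, the crontab entry itself can stay simple; since every no-match path above produces no output, the every-ten-minutes run stays silent. A sketch, with the script name and directory assumed from the question's edit:

```
# m   h   dom mon dow   command
*/10  *   *   *   *     /home/comp/perl/gilson/jimv/fedex/gocronloop
```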