Monday, June 30, 2008

Windows is for Wimps

'cause, can you do this?

main=$PWD
for dir in `\ls`; do
    [ -d "$main/$dir" ] && cd "$main/$dir" || continue
    echo -e "\n\n$main/$dir\n"
    # convert the most recently modified .oeb.gz to SDF, keeping only molecule 1
    /ppr/cadd/pkgs/openeye/bin/babel3 -i `\ls -c1 *.oeb.gz | head -n1` -o $dir.sdf -n 1 || continue
    cp -v $dir.sdf $dir-on-GCRS.ROCS.sdf
    # append the molecules of every archive, skipping each one's first entry
    for gzed in `\ls *.oeb.gz`; do
        [ -f "$PWD/$gzed" ] || continue
        sdf=`echo $gzed | sed s/'.oeb.gz'/'.sdf'/`
        /ppr/cadd/pkgs/openeye/bin/babel3 -i $gzed -o $sdf -skip 1 &&
            cat $sdf >> $dir-on-GCRS.ROCS.sdf && rm -v $sdf
        sleep 1s
    done
    # sort the combined SDF by the ComboScore tag, then clean up
    python ../sortmols.py --in=$dir-on-GCRS.ROCS.sdf --out=$dir-on-ROCS.oeb.gz --blanks_first --sdtag=ComboScore &&
        rm -rv $dir-on-GCRS.ROCS.sdf
    sleep 3s
done
cd $main


or this?

while read line; do
    echo "${line}"
    option=${line}
    opt=`echo $option | sed s/\ //g`   # strip spaces to get a compact tag
    echo "$option => $opt"
    for data in IV1 IV2 IS; do
        # grab the 100 lines after each "Evaluation" header and log a copy
        grep -A 100 Evaluation */$data/*-$data.$opt | tee logs/onTrain/${data}${opt}-performances.log
    done
done < option.run
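The working part of that loop is the grep -A / tee pair: grep keeps each "Evaluation" line plus the lines that follow it, and tee both prints the match and files a copy. A self-contained illustration (the log contents and file names here are invented, not from the actual runs):

```shell
#!/bin/sh
# Extract an "Evaluation" block from a log, printing it and saving a copy.
mkdir -p logs
cat > run.log <<'EOF'
training...
Evaluation
accuracy 0.91
loss 0.12
EOF
# -A 2: keep the two lines following each match (the post keeps 100)
grep -A 2 Evaluation run.log | tee logs/run-performances.log
```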

4 comments:

PAC said...

Ach... I'm a little scared but curious: what do those scripts produce?!

Maybe Windows doesn't allow you to write such things, but... it looks to me like some kind of mental hara-kiri!!!

Gufo said...

It saves me days of work...

they're just everyday one-liners that I put together to automate obnoxious tasks, such as cycling through different parameters when training a model, or ordering said results so that I can pick the best one.
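A minimal sketch of that kind of parameter sweep — note that train_model and its parameters here are stand-ins invented for illustration, not the actual training command:

```shell
#!/bin/sh
# Cycle through parameter combinations, then sort the runs to pick the best.
train_model() {
    # dummy stand-in: report a fake score derived from the parameters
    echo "epochs=$1 depth=$2 score=$(( $1 + $2 ))"
}
best=$(
    for epochs in 10 20; do
        for depth in 2 4; do
            train_model $epochs $depth
        done
    done | sort -t= -k4,4 -rn | head -n1   # highest score first
)
echo "best run: $best"
```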

PAC said...

And you still understand the meaning of those things one week after having written them?! O_O

You need some training in Software Engineering, IMHO ^_^

Gufo said...

I usually do, and if I don't, I just have to read them carefully to glean enough to adapt them to the new situation. Or I just write them anew. I have a txt file where I dump clever solutions. Remember, these are supposed to be quick and dirty alternatives to opening N files in a text editor, performing pattern-matching substitutions and so on, then copying the files somewhere else, zipping some of them, running a script on others whose names are derived from the first ones, deleting the input files, and then analysing the results. Or something else like that. I rarely use them again exactly as they are. But if I do, it helps that the directory structure is referenced in a generic way (e.g. using $PWD and derived variables as much as possible).
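To make that last point concrete, here is a toy snippet (the directory names are invented) that derives every path from $PWD instead of hard-coding it, so it can be dropped into any directory tree:

```shell
#!/bin/sh
# Derive every path from the current directory so the snippet is relocatable.
main=$PWD
mkdir -p "$main/demo_a" "$main/demo_b"   # invented sample directories
for dir in "$main"/demo_*; do
    base=$(basename "$dir")
    # name each output after its own directory, not a hard-coded path
    echo "processed $base" > "$dir/$base.out"
done
cat "$main"/demo_*/*.out
```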