Ravas coding goodies
- Ed_P
- Contributor
- Posts: 8374
- Joined: 06 Feb 2013, 22:12
- Distribution: Cinnamon 5.01 ISO
- Location: Western NY, USA
Ravas coding goodies
FWIW I don't understand why you needed the quotes around my.name, or how the next two lines would work with the # sign in front of them.
Ed
- Rava
- Contributor
- Posts: 5416
- Joined: 11 Jan 2011, 02:46
- Distribution: XFCE 5.01 x86_64 + 4.0 i586
- Location: Forests of Germany
Ravas coding goodies
Important info about what I wrote in this post - about editing a script while it is being executed - can be found here: Ravas coding goodies (Post by Rava #80280)
Do read post #80280 first, prior to trying any coding tweaks from this very article!
I hope I can make clear enough what I mean by all of the following - it is a bit complicated to explain well in a language that is not my native tongue.
I do not need them for my.name; I altered all info in the script and the output. E.g. there is no such URL as https://what.ever - I even made up the TLD .ever.
The example is not to be taken literally. A more real-life file name would look more like "One Epic Whatever Part 3.extension".
I explained that in a post above.
Since the source URL is still missing, the line is deactivated.
You can still start the first download by starting the script. When you go to get the 2nd URL - presuming you have to look for it (as I explained in the above post: due to extra expiration or token info you have to look up one by one, or the need to solve a captcha, or to wait a certain number of seconds, or whatever else you need to do to get the actual direct URL for downloading the file) - it can take some time to get several URLs.
But the first download is already running in the background, and since bash or sh do not read a script in advance, you can edit and add info in the lines not yet read by the shell. When you save the script file and the shell has finished the first line, aka the first download, it will then start the 2nd download, even though you did not have that info ready when the script was initially started. And if you do not manage to get the consecutive URLs ready in time, the script will simply end without wget errors - as long as those lines are still commented out. An uncommented line with a still-empty URL, like
Code: Select all
wget -c "" -O somename
would result in a wget error:
Code: Select all
guest@porteus:~$ wget -c "" -O somename
http://: Invalid host name.
Just try it out for yourself. Best use a script like the one described above, with #!/bin/sh (or #!/bin/bash), and start the script when you have only finished completing the first wget line. While the script downloads the first file, you can add the 2nd, 3rd and nth wget URL, saving the file as soon as you have completed the next wget line (by adding the URL and giving the file the wanted file name). In the background, wget downloads one file after the other, starting the next file even when that line was still deactivated when you started the script. Just remember to save the file whenever you have finished the next wget line, and work from top to bottom, not in random or reverse order, because sh and bash read any script from beginning to end, line by line, never in advance - so saving changes to a script that has already started means bash will react to the changes as soon as it reaches the changed line.
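If you would rather not rely on keeping the lines commented out, the empty-URL error can also be avoided defensively. A minimal sketch - the fetch helper name is my invention, not part of the original script:

```shell
#!/bin/sh
# Skip wget calls whose URL is still empty instead of letting wget
# fail with "http://: Invalid host name."
fetch() {
    url=$1
    out=$2
    if [ -z "$url" ]; then
        echo "skipping '$out' - no URL yet"
        return 0
    fi
    wget -c "$url" -O "$out"
}

fetch "" "part 1.extension"   # prints the skip message, never calls wget
```

With this, a line whose URL you never managed to fill in just reports itself and the script moves on.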
Last edited by Rava on 10 Dec 2020, 05:19, edited 1 time in total.
Reason: added top bold and red info
Cheers!
Yours Rava
- Rava
- Contributor
- Posts: 5416
- Joined: 11 Jan 2011, 02:46
- Distribution: XFCE 5.01 x86_64 + 4.0 i586
- Location: Forests of Germany
Ravas coding goodies
Important info regarding what I wrote in the posts above about editing a script while it is being executed:
When executing a script via the file manager of the DE, the whole script is preloaded - at least in my case (I use uxterm for this). So the live-editing trick would not work that way.
2nd info:
Let us presume you have a .wget.sh prepared in the target folder that looks like so:
Code: Select all
#!/bin/sh
#wget -c "" -O "part 1.extension"
echo press enter 1
read
#wget -c "" -O "part 2.extension"
echo press enter 2
read
#wget -c "" -O "part 3.extension"
echo press enter 3
read
echo press enter xx
read
You now want to add the URLs from top to bottom and start the script via a terminal - not via the DE file manager (see above for why). As soon as you have acquired the first URL, it looks like so:
Code: Select all
#!/bin/sh
wget -c "https://what.ever/123/456/789.ext?token=bla&expires=123&id=456" -O "part 1.extension"
echo press enter 1
read
#wget -c "" -O "part 2.extension"
echo press enter 2
read
#wget -c "" -O "part 3.extension"
echo press enter 3
read
echo press enter xx
read
Now, after saving .wget.sh, you can start it via
Code: Select all
guest@porteus:/mnt/sdz2/downloads$ ./.wget.sh
and while the 1st download is, well, downloading in the background, you add the 2nd URL and disable the first echo info and read pause:
Code: Select all
#!/bin/sh
wget -c "https://what.ever/123/456/789.ext?token=bla&expires=123&id=456" -O "part 1.extension"
#echo press enter 1
#read
wget -c "https://what1.ever/123a/456b/789c.ext?token=bla1&expires=123&id=456" -O "part 2.extension"
echo press enter 2
read
#wget -c "" -O "part 3.extension"
echo press enter 3
read
echo press enter xx
read
And so on: while the current download is still running you can disable the echo and read lines below it; once it has already finished you need them, or else the script would have moved on - and reached a part not yet ready for execution.
Now let us presume the 2nd download is currently running and you have finished getting the 3rd URL; the saved script would look like so:
Code: Select all
#!/bin/sh
wget -c "https://what.ever/123/456/789.ext?token=bla&expires=123&id=456" -O "part 1.extension"
#echo press enter 1
#read
wget -c "https://what1.ever/123a/456b/789c.ext?token=bla1&expires=123&id=456x" -O "part 2.extension"
#echo press enter 2
#read
wget -c "https://what3.ever/123bxxx/456cxxx/789dxxx.ext?token=bla123&expires=123&id=456a" -O "part 3.extension"
#echo press enter 3
#read
echo press enter xx
read
and now you decide to add even more URLs for consecutive downloads by first adding
Code: Select all
#wget -c "" -O "part 4.extension"
echo press enter 4
read
#wget -c "" -O "part 5.extension"
echo press enter 5
read
in between
Code: Select all
#echo press enter 3
#read
and
Code: Select all
echo press enter xx
read
That would work, right?
Spoiler alert!
It would not work. Although bash - or the bash that acts as sh - does not read the executed script in advance, it does know how many lines the script had when execution started.
So adding previously non-existing lines annoys bash and results in errors like this:
Code: Select all
./.wget.sh: line 37: unexpected EOF while looking for matching `"'
./.wget.sh: line 43: syntax error: unexpected end of file
Cave! The error is not caused by an actually missing `"' - bash expected the script to have only a certain number of lines, and adding lines breaks that assumption, resulting in the above error.
Lessons hopefully learned:
If executed via an xterminal or VT (virtual terminal) - not by double-clicking the script in your DE file manager - you can add info and disable or enable lines by adding or removing the # at the start of a line while the script is being executed, and bash (or bash acting as sh) honours the changes as soon as it reaches the line. But you can never add new lines to a script that is already being executed.
Indeed, in the examples of editing a script while it is executed I learned new things myself, from Ravas coding goodies (Post by Rava #80044) up to this very post… and it could be I am still missing some pitfalls.
Cheers!
Yours Rava
- Rava
- Contributor
- Posts: 5416
- Joined: 11 Jan 2011, 02:46
- Distribution: XFCE 5.01 x86_64 + 4.0 i586
- Location: Forests of Germany
Ravas coding goodies
Rava wrote: ↑12 Jan 2021, 17:19
donald, I created this script thinking of you.
Place it in $PATH - but go to your video folder and use a relative path.
E.g. go to /mnt/sda6/video and execute it like so:
Code: Select all
# make-ffplay-script Filme
or
Code: Select all
# make-ffplay-script Filme/
but not like so:
Code: Select all
/somewhere/else # make-ffplay-script /mnt/sda6/video/Filme/
and it will create Filme.sh, which plays all supported files in alphanumerically sorted order - this makes more sense for a series than for a diverse collection of movies.
Filme/ is meant to be an existing folder in $PWD containing supported files.
The script is meant to use relative paths, not absolute ones, so that any drive with the videos can be played regardless of whether it is mounted as sdb or sdz.
Cave! For now it does not set Filme.sh +x for everyone, but just add this as the last line:
Code: Select all
chmod a+x "$PARAMETER".sh
Here is the code of make-ffplay-script. The unusually high version number is explained by the fact that it is based on a script for creating m3u playlists, which was already at version 3.1:
Code: Select all
#!/bin/sh
# V3.2 - $1 is the relative folder with the files, will be stripped of any "/"
# - e.g. Filme or Filme/ works but not /mnt/sda1/video/Filme nor /mnt/sda1/video/Filme/
# for now supported file endings are
# avi flv mp3 mp4 mpg mkv ogv TS ts webm divx VOB vob
PARAMETER=$(echo "$1"|sed 's|/||g')
if [ -d "$PARAMETER" ]; then
	echo working on "$PARAMETER"… writing into "$PARAMETER".sh
	echo '#!/bin/sh' >"$PARAMETER".sh
	find "$PARAMETER" |grep -E "\.avi$|\.flv$|\.mp[34g]$|\.mkv$|\.ogv$|\.[tT][sS]$|\.webm$|\.divx$|\.[vV][oO][bB]$" |sort | while read line ; do
	{
		echo ffplay -autoexit -hide_banner -i \"${line}\" >>"$PARAMETER".sh
	} ; done
else
	echo Given parameter "$PARAMETER" is not a directory. Abort.
	exit 1
fi
Cheers!
Yours Rava
- Rava
- Contributor
- Posts: 5416
- Joined: 11 Jan 2011, 02:46
- Distribution: XFCE 5.01 x86_64 + 4.0 i586
- Location: Forests of Germany
Ravas coding goodies
Today Ravas coding goodies shares code by a dear forum.porteus.org member, donald, relevant to the above make-ffplay-script.
Rava wrote: ↑14 Jan 2021, 13:03
[ q to jump to next file ] - what a surprise, same as with my script.
donald wrote: ↑14 Jan 2021, 09:32
So if I'm really too lazy to click one after the other I could do something like this: [ q to jump to next file ]
Code: Select all
find /mnt/sda4/Professor-T/ -type f -name "*.mp4" | sort | while read f; do ffplay -autoexit -- "$f"; done
Still, my script could be considered to have a tiny advantage over your otherwise neat solution.
Neither can be terminated easily: [ q ] just quits the current ffplay, and the next one is started. When there are dozens of video files in one folder - wait until the next ffplay opens, press [ q ], wait for the next one, press [ q ] again… - that can easily get annoying.
In my case, you only need to
Code: Select all
killall Professor-T.sh
(unless you have more scripts doing whatnot, all named after your video folders - but why would you name them as such?), while with your solution you should first check whether there is some background (cron) job using find that you probably do not want to terminate.
But aside from the rare need to terminate even the not-yet-started ffplays - nice solution.
Maybe it could even be made into a file manager association: "play all supported media files in folder with ffplay".
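For the termination problem mentioned above, one hedged tweak to donald's loop is to let a non-zero exit status of the player abort the remaining queue (interrupting a player usually produces such a status). This sketch substitutes a stub player for ffplay so it can run anywhere; play_all and PLAYER are made-up names:

```shell
#!/bin/sh
# Play every matching file in sorted order, but stop the whole queue
# as soon as one player invocation fails or is interrupted.
play_all() {
    find "$1" -type f -name "*.mp4" | sort | while read -r f; do
        "$PLAYER" "$f" || exit 1   # exit leaves the while-loop subshell
    done
}

# demo with a stub player that just prints what it was asked to play
PLAYER=echo
d=$(mktemp -d)
touch "$d/b.mp4" "$d/a.mp4"
play_all "$d"    # prints a.mp4 first, then b.mp4 (sorted)
```

With PLAYER=ffplay (plus whatever options you like) the same function would play the real queue.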
Cheers!
Yours Rava
- Rava
- Contributor
- Posts: 5416
- Joined: 11 Jan 2011, 02:46
- Distribution: XFCE 5.01 x86_64 + 4.0 i586
- Location: Forests of Germany
Ravas coding goodies
That's just one of the many awesome things about a *N*X* system (Unix, Linux, BSD): you make a mistake and discover an ability of a program you did not know about.
This time it was diff.
I wanted to compare different versions of my main aliases-and-functions file that gets loaded into every root or guest shell: /usr/local/bin/aliasset.
But instead of typing
diff /path/one/rootcopy/usr/local/bin/aliasset /path/two/backup/usr/local/bin/aliasset
I executed
diff /path/one/rootcopy/usr/local/bin/ /path/two/backup/usr/local/bin/
And what do you think diff did?
Complain about missing files to compare?
Nope, much better: it compared the two directories entry by entry and reported a long list like so
Only in /path/one/rootcopy/usr/local/bin: monitorrotate.LVDS-0.right
Only in /path/one/rootcopy/usr/local/bin: mount-Lsfind
but also the diff I wanted:
diff /path/one/rootcopy/usr/local/bin/aliasset /path/two/backup/usr/local/bin/aliasset
2c2
< # aliasset V3.6.37 (2021-01-08)
---
> # aliasset V3.6.36 (2020-12-28)
[…]
That comparing of directories, with info like
Only in /path/one/: whatever.sh
could be incorporated into scripts: you have a sample setup folder with several needed files, and a simple diff would detect whether one or more needed setup files are missing from the actual in-use setup directory, and report and/or warn the user about the missing essential files.
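Here is a hedged sketch of that idea. The folder and file names are invented for the demo, and diff -q is used so only file-level differences are listed:

```shell
#!/bin/sh
# Warn about files that exist in the sample setup folder but are
# missing from the in-use setup folder, using diff's "Only in" lines.
sample=$(mktemp -d)
inuse=$(mktemp -d)
touch "$sample/needed.conf" "$sample/common.conf" "$inuse/common.conf"

# "Only in $sample: file" names a file missing from $inuse
missing=$(diff -q "$sample" "$inuse" | grep "^Only in $sample" | sed 's/.*: //')
if [ -n "$missing" ]; then
    echo "WARNING: missing essential file(s): $missing"
fi
```

The same pattern works for the reverse direction by grepping for "^Only in $inuse" instead.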
Cheers!
Yours Rava
- Rava
- Contributor
- Posts: 5416
- Joined: 11 Jan 2011, 02:46
- Distribution: XFCE 5.01 x86_64 + 4.0 i586
- Location: Forests of Germany
Ravas coding goodies
Another snippet I already posted elsewhere: 1024calc - an awk script for calculating file sizes in powers of 1024, or in other words: real B, KB, MB, GB, TB or PB.
Rava wrote: ↑22 Dec 2018, 01:55
Code: Select all
root@porteus:/# 1024calc 454656
444 KB
444.00 KB
root@porteus:/# cat /usr/local/bin/1024calc
#!/usr/bin/awk -f
BEGIN{
x = ARGV[1]
split("B KB MB GB TB PB",type)
for(i=5;y < 1;i--)
y = x / (2**(10*i))
print y " " type[i+2]
printf("%.2f %s\n",y,type[i+2])
}
When executed without a parameter it displays this gem:
Code: Select all
guest@porteus:~$ 1024calc
awk: /usr/local/bin/1024calc:8: fatal: division by zero attempted
Cheers!
Yours Rava
- Rava
- Contributor
- Posts: 5416
- Joined: 11 Jan 2011, 02:46
- Distribution: XFCE 5.01 x86_64 + 4.0 i586
- Location: Forests of Germany
Ravas coding goodies
Want a quick working overview of the number of files (normal or hidden) or folders (also normal or hidden)? Just use ls and wc like so:
Code: Select all
guest@porteus:/mnt/sdb2/video$ ls |wc
352 1044 7331
guest@porteus:/mnt/sdb2/video$ ls -a |wc
377 1087 7958
guest@porteus:/mnt/sdb2/video$ ls -d */|wc
179 519 3474
guest@porteus:/mnt/sdb2/video$ ls -da */ .*/|wc
183 523 3503
As you can see, we have 377 files and folders in total, and 352 normal ones; the difference is the hidden files and folders.
And we have 183 folders in total, and 179 normal ones; the difference is the hidden folders.
Try figuring out for yourself whether the file or folder counts include the special folders "." and ".." - or not - and why.
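A side note: only the first column of wc matters here (the word and character columns would also be skewed by file names containing spaces), so wc -l is enough. A small self-contained sketch - the directory and file names are made up:

```shell
#!/bin/sh
# Count directory entries with ls | wc -l; the -A switch includes
# hidden entries but leaves out the special folders . and ..
d=$(mktemp -d)
mkdir "$d/sub"
touch "$d/a" "$d/b" "$d/.hidden"
( cd "$d" || exit 1
  ls | wc -l      # counts a, b, sub
  ls -A | wc -l   # counts a, b, sub, .hidden
)
```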
Cheers!
Yours Rava
- Rava
- Contributor
- Posts: 5416
- Joined: 11 Jan 2011, 02:46
- Distribution: XFCE 5.01 x86_64 + 4.0 i586
- Location: Forests of Germany
Ravas coding goodies
Small but nice:
CAVE! The script is stupid - you have to adjust it to the number of your CPU cores manually.
I like the date+time info, especially when comparing more than one instance of the data. Same with fxsx, sx, fx, dx, or now also mysensors.sh.
mysensors.sh
Code: Select all
#!/bin/sh
function tz () {
echo $(date +%d.%m.%Y\ %H:%M:%S) ____________________________________________________________
}
tz
sensors|tail -n 5|head -n 4
Alternatively, instead of |tail -n n+1|head -n n you can use grep "Core ".
How to code the script with grep "Core " instead of |tail -n n+1|head -n n … that I leave to you. By now everyone reading this thread should be able to pull that off.
Instead of:
Code: Select all
guest@porteus:~$ sensors
coretemp-isa-0000
Adapter: ISA adapter
Core 0: +63.0°C (high = +84.0°C, crit = +100.0°C)
Core 1: +55.0°C (high = +84.0°C, crit = +100.0°C)
Core 2: +59.0°C (high = +84.0°C, crit = +100.0°C)
Core 3: +55.0°C (high = +84.0°C, crit = +100.0°C)
you get this:
Code: Select all
guest@porteus:~$ mysensors.sh
02.10.2021 07:53:57 ____________________________________________________________
Core 0: +58.0°C (high = +84.0°C, crit = +100.0°C)
Core 1: +56.0°C (high = +84.0°C, crit = +100.0°C)
Core 2: +59.0°C (high = +84.0°C, crit = +100.0°C)
Core 3: +56.0°C (high = +84.0°C, crit = +100.0°C)
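For completeness, a hedged sketch of that grep variant. The sensors function here is a stub reproducing the output shape above, so the snippet runs even without lm-sensors installed; drop the stub to use the real tool:

```shell
#!/bin/sh
# mysensors.sh variant: pick the per-core lines with grep "Core "
# instead of counting lines with tail and head.
sensors() {   # stub standing in for the real sensors binary
    printf 'coretemp-isa-0000\nAdapter: ISA adapter\n'
    printf 'Core 0:  +58.0 C\nCore 1:  +56.0 C\n'
}
tz () {
    echo "$(date +%d.%m.%Y\ %H:%M:%S) ________________________"
}
tz
sensors | grep "Core "
```

Unlike the tail/head version, this one needs no adjusting when the number of cores changes.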
Cheers!
Yours Rava
-
- Full of knowledge
- Posts: 2074
- Joined: 17 Jun 2013, 13:17
- Distribution: Porteus 3.2.2 XFCE 32bit
- Location: Germany
Ravas coding goodies
run sensors + date in a loop every n seconds and get the output in terminal and
nicely logged in a file.
Code: Select all
#!/bin/sh
logfile=/home/guest/cpu-temp.log
while true
do
echo $(date +%d.%m.%Y\ %H:%M:%S) | tee -a $logfile
echo "$(sensors | grep -i core)" | tee -a $logfile
echo | tee -a $logfile
sleep 5
done
- Rava
- Contributor
- Posts: 5416
- Joined: 11 Jan 2011, 02:46
- Distribution: XFCE 5.01 x86_64 + 4.0 i586
- Location: Forests of Germany
Ravas coding goodies
^
nice, I would prefer this:
Code: Select all
#!/bin/sh
logfile=/home/guest/cpu-temp.log
while true
do
echo $(date +%d.%m.%Y\ %H:%M:%S) | tee -a $logfile
echo "$(sensors | grep Core)" | tee -a $logfile
echo | tee -a $logfile
sleep 5
done
grep without -i needs significantly fewer resources.
Run once by itself that is meaningless, but run every 5 seconds…
Cheers!
Yours Rava
-
- Full of knowledge
- Posts: 2074
- Joined: 17 Jun 2013, 13:17
- Distribution: Porteus 3.2.2 XFCE 32bit
- Location: Germany
Ravas coding goodies
Yep, -i is not necessary in this case.
"running every 5sec"; that was just for testing.
I think that the CPU-Temp isn't changing that fast -- if the fan is not broken --
so i would use higher values like 20 or so.
"running every 5sec"; that was just for testing.
I think that the CPU-Temp isn't changing that fast -- if the fan is not broken --
so i would use higher values like 20 or so.
- Rava
- Contributor
- Posts: 5416
- Joined: 11 Jan 2011, 02:46
- Distribution: XFCE 5.01 x86_64 + 4.0 i586
- Location: Forests of Germany
Ravas coding goodies
I have an indexing system that creates a find.gz and ls.gz listing of each partition of each drive. All these sit in an approx. 60 MB file that is formatted as an ext2 filesystem and mounted at each Porteus startup.
When I use zgrep to search, you can see the difference between e.g. "[Aa]ntwort" and "-i antwort" - the 2nd one is so much slower.
Of course, 60 MB of gzip-compressed simple text files is a huge amount in uncompressed size.
Especially since creating a log-entry every 5 seconds makes for a huge log file.
Anyhow, I would still prefer the manual way.
First call top4x (see initial post), and when one process has almost all CPU time to itself (in my system the ceiling is 800%, so one process having 750% is a lot), then call mysensors.sh (with grep "Core " instead of |tail -n n+1|head -n n) as often as needed, in between calling top4x to see how the CPU is doing.
________________________
Of course one could create a script that analyses what top reports - a script that knows that in my 4-core system each core can handle 2 tasks at once, so 800% CPU max, and that then calculates that >600% (in my case only) is the threshold for reporting the program via top and adding the mysensors.sh info…
But I do not bother creating such a script; it is questionable whether, with the few machines I have, I could even test if such a script gets the CPU capabilities right…
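The core-count part of such a script would not be hard. A hedged sketch using nproc - the 75% factor is my assumption, chosen to match the 600-of-800 example:

```shell
#!/bin/sh
# Derive top's per-process CPU ceiling from the logical CPU count
# and flag anything above 75% of it (600 of 800 in the example above).
cores=$(nproc)                # logical CPUs, e.g. 8 on a 4-core HT box
max=$((cores * 100))          # top's ceiling in percent, e.g. 800
threshold=$((max * 3 / 4))    # e.g. 600
echo "ceiling ${max}%, report processes above ${threshold}%"
```

Whether nproc reflects the machine correctly (VMs, cgroup limits) is exactly the kind of thing that would need testing on more hardware.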
Cheers!
Yours Rava
-
- Shogun
- Posts: 434
- Joined: 02 May 2017, 09:51
- Distribution: v3.2.2-32 and Porteus-Artix-64
- Location: Chennai,India
Ravas coding goodies
Porteus-5.0rc3 has got conky. Conky-Gotham shows the CPU temp every few seconds. Why a script and log file?
Linux Kernel-4.4.272 -32 bit; Linux Kernel-5.4.185 - 64 bit
- Rava
- Contributor
- Posts: 5416
- Joined: 11 Jan 2011, 02:46
- Distribution: XFCE 5.01 x86_64 + 4.0 i586
- Location: Forests of Germany
Ravas coding goodies
Is it really part of every flavour of Porteus 5.0rc3?
As in: including Openbox, including XFCE 4.16, and XFCE 4.12?
Cheers!
Yours Rava