Showing posts with label *nix. Show all posts

Wednesday, July 10, 2019

QNAP SSH/RSYNC traffic shaped on uplink from Orange fiber due to CoS/QoS/DSCP

I've spent some time on this, and I learned a few things in the process that are not very straightforward.
It all started when I found that the sync process for data from my NAS had become super slow (capped at 128 KB/s, i.e. about 1 Mbit/s). It uses rsync over ssh, and I was able to replicate the problem by checking that a plain scp transfer out was capped as well.
After digging around I isolated it to the provider (or their router). There was nothing on the router to indicate throttling of ssh. When I tried scp towards a different port I was still capped. WTH? Yet when I quickly spun up an openvpn instance, the throttling went away, even with the extra overhead and still using rsync over ssh.
There must be something detecting SSH with deep packet inspection, I concluded, and called my provider (Orange SK). They said all should be okay and told me to call them when I'm on site. Since that was two weeks away, I kept digging. Reaching out to my friends on Facebook, one of the guys (Juraj Lutter) mentioned that I should check DSCP.
And indeed, adding -o "IPQoS throughput" to my scp command made my transfer fast again. The simple explanation is that ssh defaults to a low-delay marking for interactive sessions and throughput for non-interactive ones, but that distinction gets mixed up with rsync.
Now on to setting it up on the QNAP. Oh, the horrors: patching the config generation by running sed over /etc/init.d/login.sh to get a proper /etc/config/ssh/sshd_config was the easy part. The harder part was figuring out that even if your HOME points to /root, /share/homes/admin/.ssh/config still gets read during ssh/scp startup. Anyway, after I set it everywhere, the backup now works fine even for rsync transfers.

Remember: if your ssh/scp is rate-limited by some operators, make sure to set "IPQoS throughput" in your ~/.ssh/config.
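To make the fix permanent, a minimal config stanza like this should do (a sketch; the Host * pattern is my choice, scope it more narrowly if you prefer; the per-command equivalent is scp -o "IPQoS throughput"):

```
# ~/.ssh/config (and, on the QNAP, /share/homes/admin/.ssh/config)
Host *
    IPQoS throughput
```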

Monday, July 28, 2014

One-liners: Deformalize

Recently I needed a quick hack for the following situation. Given input like this:
A 1
A 2
B 3
C 4
C 5
C 6
C 7
D 8

I wanted to generate the following:
text A -p 1 -p 2
text B -p 3
text C -p 4 -p 5 -p 6 -p 7
text D -p 8

After some thought I generated it with this awk one-liner:
awk '{ if (X == $1) { printf " -p %s", $2 } else { printf "%stext %s -p %s", (NR > 1 ? "\n" : ""), $1, $2; X = $1 } } END { print "" }' file
What would you do? Is there an easier way?
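To sketch one possible cleanup: the same transformation wrapped in a small function so it can sit in a pipeline (deformalize is just a name I made up here):

```shell
# Hypothetical helper: group "KEY VALUE" lines into "text KEY -p v1 -p v2 ..."
deformalize() {
  awk '$1 != prev { printf "%stext %s", (NR > 1 ? "\n" : ""), $1; prev = $1 }
       { printf " -p %s", $2 }
       END { if (NR) print "" }'
}

# Usage: printf 'A 1\nA 2\nB 3\n' | deformalize
```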

Friday, February 14, 2014

GZIP/BZIP2 progress indicator

Have you ever needed to figure out how long gzip/bzip2 compression of a file will take, and found it difficult to present that to the user? For scripts dealing with large files this is a major usability issue. Well, there's a solution: a tool called pipe viewer (pv). You can find more details on how to use it on its home page, but I'll show you how I've used it to show progress for bzip2 compression of large files.
filename="file_to_bzip2"
size=`stat -L "$filename" -c "%s"`
pv -s $size "$filename" | bzip2 -c > "$filename.bz2"
Generates following output:
# output
2.75GB 0:32:47 [ 1.5MB/s] [===================>     ] 61% ETA 0:20:16

Sunday, May 29, 2011

How to convert multi-page HTML e-book for Kindle

Recently I got a link to the interesting e-book Architecture of Open Source applications. Because I prefer to read books on my Kindle and there was no MOBI version, I decided to prepare it myself. Here is a step-by-step guide on how to convert multi-page HTML into a format suitable for Kindle on your Ubuntu/Debian/other Linux.

Getting data

First of all we need to get the HTML/CSS and image files onto the local machine. On my machine it's as simple as:
$mkdir ~/aosabook; cd ~/aosabook
$wget -I en,images -nd -r -k -l 3 http://www.aosabook.org/en
We want to download all documents recursively, but only from the en and images directories, without recreating the directory structure locally, and with paths in the HTML documents rewritten so they are locally referenced. For more details check man wget.

Convert to single HTML

Now we have all the data downloaded in ~/aosabook, and to check whether the book is readable we just have to open the directory (file://$HOME/aosabook/) in a browser.
Because the structure of the web version is multi-page, we have to do an additional step: convert the multi-page document into a single page. I used the htmldoc utility for this. Run htmldoc and do the following:
  • input tab
    • choose Document Type: Book
    • Use add files button to add all html files from ~/aosabook/
    • Select cover image
  • output tab
    • set output to file
    • set output path to ~/aosabook/aosabook.html
    • set output format to html
Play with some other options, namely browser width (set it to 600px for Kindle 3), and hit Generate.
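If you prefer the command line over the GUI, roughly the same result can be had with htmldoc's CLI switches (a sketch; verify the exact flag spellings against htmldoc --help on your version):

```
htmldoc --book -t html --browserwidth 600 \
    -f ~/aosabook/aosabook.html ~/aosabook/*.html
```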

Convert to Mobi

In order to convert to the MOBI format suitable for Kindle, I just added the generated aosabook.html as a book in Calibre (you use Calibre for your Kindle management, right?), clicked the book to convert it to MOBI, and uploaded it to the device.
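If you'd rather script this step too, Calibre ships a command-line converter (assuming Calibre's ebook-convert is on your PATH):

```
ebook-convert ~/aosabook/aosabook.html ~/aosabook/aosabook.mobi
```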

Saturday, March 15, 2008

dog - better cat than cat

Every *nix user has used the "cat" command. It takes one or more files (or stdin), concatenates them, and writes the result to standard output. A common use case is printing the contents of a file to the console.
$ cat ~/myfile.txt
The situation nowadays is different from the time cat was born. Back then, users were usually connected to one mainframe and everything was done on the local system. Today we use web 2.0 applications spread across several systems; we use our computers as terminals for network services. That's why an enhanced version of cat has emerged, one that supports network URLs as source and/or target, several string conversions, newline conversion, and more. The name of this utility is dog (after all, dogs are much more useful, don't you think?).
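For example, a quick session might look like this (a sketch; example.com is a placeholder, and the option name should be verified against dog's man page):

```
$ dog http://www.example.com/index.html   # read a network URL as if it were a local file
$ dog --rot13 myfile.txt                  # one of dog's string conversions
```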