
Digbit – automatic bit-shifted domain generation for BitSquatting [python]

BitSquatting is not brand new, but it is still relatively little known. This type of attack relies on a random bit flipping from 0 to 1 or from 1 to 0 in RAM. Because of this, even though you try to access a domain like cnn.com, your computer may in fact request ann.com. It is rare, but it happens, and it can be caused by cosmic radiation or overheated memory modules. If you would like to learn more, I can recommend Artem's page.
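
To see why a single flipped bit is enough to turn one domain into another, you can compare the ASCII codes of 'c' and 'a' in any shell (a quick check, not part of the scripts below):

# 'c' is 0x63 (01100011) and 'a' is 0x61 (01100001) in ASCII
printf '%02x %02x\n' "'c" "'a"   # prints: 63 61
echo $(( 0x63 ^ 0x61 ))          # prints: 2, i.e. the two codes differ in exactly one bit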

To make it easier to find domains that are a single bit flip away (a Hamming distance of 1), I've created a script that generates all the possibilities.

For example, let's search for domains (bit-wise) close to cnn.com. The script output will be:

snn.com knn.com gnn.com ann.com bnn.com c.n.com cfn.com cjn.com cln.com con.com cnf.com cnj.com cnl.com cno.com cnn.som cnn.kom cnn.gom cnn.aom cnn.bom cnn.cgm cnn.ckm cnn.cmm cnn.cnm cnn.coe cnn.coi cnn.coo cnn.col

To make it easier to check whether a particular domain is already registered, I've made a wrapper script that executes the Python script and, for each generated domain, runs the command:

> nslookup domain | grep "NXDOMAIN"

The wrapper script is executed with a single argument, the domain name. Sample output for twitter.com:

Twitter bitsquatting

Some of the domains reported as available are obviously false positives, since TLDs like kom don't really exist. I did not remove them, because new TLDs are added from time to time and you might as well have a custom domain set up within your LAN.
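
If you would rather filter those impossible TLDs out automatically, one possible refinement (not part of the scripts below) is to check each generated suffix against the public IANA TLD list:

# hypothetical extra filter: keep only names whose TLD appears in the IANA list
curl -s https://data.iana.org/TLD/tlds-alpha-by-domain.txt | grep -v "^#" | tr 'A-Z' 'a-z' > tlds.txt
tld=$(echo "cnn.kom" | rev | cut -d "." -f 1 | rev)
grep -qx "$tld" tlds.txt && echo "TLD exists" || echo "TLD does not exist"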

The wrapper code:

#!/bin/bash

# run digbit.py for the given domain and test every generated name with nslookup
for i in $( ./digbit.py "$1");
do
        # only lines containing a dot are real domain candidates
        dotcheck=$( echo "$i" | grep "\." | wc -l)
        echo -n "$i ";
        # NXDOMAIN in the nslookup output means the name does not currently resolve
        check=$(nslookup "$i" | grep "NXDOMAIN" | wc -l);
        if [[ $check -ne 0 && $dotcheck -ne 0 ]];
        then
                echo " < available";
        else
                echo " ";
        fi;
done

The Digbit script code:

#!/usr/bin/env python3
import re
import sys

def new_text2bin(mtext):
        # convert every character to its 8-bit binary representation and join them
        result = [ bin(ord(ch))[2:].zfill(8) for ch in mtext ]
        return ''.join(result)

def switch_bit(bit, my_list ):
        # return a copy of the bit list with the bit at the given index flipped
        result_list = my_list[:]
        if result_list[bit] == '1':
                result_list[bit] = '0'
        else:
                result_list[bit] = '1'
        return result_list

def generate_similar( domain ):
        # flip every bit of the domain once and collect the resulting strings
        domains = []
        domain_list = list(new_text2bin(domain))
        for i in range(len(domain_list)):
                domains.append(binstr_to_ascii(''.join(switch_bit(i, domain_list))))
        return domains

def binstr_to_ascii( binstr ):
        # pad to a multiple of 8 bits, then decode each 8-bit block back to a character
        binstr = -len(binstr) % 8 * '0' + binstr
        string_blocks = (binstr[i:i+8] for i in range(0, len(binstr), 8))
        string = ''.join(chr(int(char, 2)) for char in string_blocks)
        return string

if len(sys.argv) != 2:
        sys.exit("Usage: digbit.py domain")
else:
        domain=str(sys.argv[1])

        # the binary form of the domain; every position is a candidate single-bit flip
        domain_list = list(new_text2bin(domain))

        for i in range(len(domain_list)):
                new_d = (''.join(switch_bit(i, domain_list)))

                new_d_str = binstr_to_ascii(new_d)
                # keep only results that still look like a valid (lowercase) domain name
                correct_domain = re.match(r"^(((([a-z0-9]+){1,63}\.)|(([a-z0-9]+(\-)+[a-z0-9]+){1,63}\.))+){1,255}$", new_d_str + "." )

                if ( correct_domain is not None and len(correct_domain.string) > 1 and (correct_domain.string).index(".") < len(correct_domain.string)-1 ):
                        print(new_d_str, end="\n")

        print(" ")


DNS zone transfer script

Note: I've made a web-based version of this script that has more features and an archive of successful transfers.

A script automating the discovery of name servers that allow zone transfers.

Nothing fancy. Just to make it easier.
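
Under the hood it just wraps the two commands you would otherwise run by hand, roughly like this (ns1.example.com stands for one of the returned name servers):

# list the name servers for the domain, then ask one of them for a full zone transfer
host -t ns example.com
dig axfr example.com @ns1.example.com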

The output:

Zone transfer discovered

If you use the command shown at the bottom of the image above, you will get results like this:

Successful zone transfer for example domain


Script:

#!/bin/bash

domains="$1"
data="";

for dnsserver in $(host -t ns "$domains" | cut -d " " -f 4);
do
        # VARIABLES
        # strip the trailing newline and dot from the name server returned by host
        str_len=$(echo "$dnsserver" | tr -d " " | wc -c)
        str_len=$(echo "$str_len-2"| bc )
        dns_server=$(echo "$dnsserver" | cut -b "-$str_len")
        zone_data=$(dig axfr "$1" "@$dns_server")

        # CHECKING ZONE TRANSFER
        # dig reports "Transfer failed." when the server refuses AXFR
        check=$(echo "$zone_data" | grep "Transfer failed" | wc -l)

        if [[ $check -ne 0 ]];
        then
                echo -e " Transfer \033[31mFAILURE\033[00m at $dns_server"
        else
                echo -e " Transfer \033[32mSUCCESS\033[00m at $dns_server"

                # REMEMBER LAST SUCCESSFUL
                data="$zone_data";
                server="$dns_server"
        fi

done

echo ""
echo " Use command: dig axfr $1 @$server"

# UNCOMMENT THIS IF YOU WANT ZONE DATA OUTPUT
# echo "$data"


4chan board /hr image downloader

[ command line image download manager ]

Two decades ago, browsing the internet over a 56k modem was an agonizing experience whenever you encountered a webpage rich with pictures. They had to be compressed, and of course the compression was lossy.
Now you can download high-resolution pictures with the click of a button and wait only a couple of seconds for them to be fully loaded. Bandwidth is not an issue anymore.
What IS the issue, then? Where to get really high-resolution pictures (above 1920×1080) on a specific and very narrow topic.
If, like me, you like old medieval maps, NASA's best space pictures, landscape photos or old paintings that are hard to find in high resolution, and you will not be offended by occasional nudity, then the /hr board at 4chan.org is the place for you. There you will find multiple collections of truly amazing pictures compiled into single threads, just waiting for you to grab them. Yes, this is 4chan, famous for being lightly moderated and for anonymous postings; as warned before, you might encounter some nudity, but I guess that is the price for top-notch pictures you would otherwise never have found.

The /hr board is a collection of threads containing posts with pictures. While I really like some of them, I'm not a patient person when it comes to downloading things manually by clicking on each and every one. Therefore, I've created a bash script that downloads all the pictures for me automatically. It is fairly simple and works in three phases: first it collects all the links to threads, second it parses those threads and isolates the links to images, and finally it downloads those images to the specified directory.

While it is capable of downloading at full speed, I've limited the parsing of webpages to 10000 B/s and the downloading of images to 200 kB/s, with curl and wget respectively.
I think it's a matter of netiquette not to overload the 4chan servers.
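
Stripped of all the progress-bar drawing, the three phases boil down to roughly the following sketch. It uses a simpler href grep than the sed chain in the full script, and the ./downloads directory is an arbitrary stand-in for the per-day folder the script creates:

# 1. collect links to threads on the /hr board (rate-limited)
curl -s --limit-rate 10000B http://boards.4chan.org/hr/ \
        | grep -o 'href="[^"]*"' | grep "res/" | cut -d '"' -f 2 | sort -u > links.4chan

# 2. extract image links from every thread
for thread in $(cat links.4chan); do
        curl -s --limit-rate 10000B "http://boards.4chan.org/hr/$thread" \
                | grep -o 'href="[^"]*"' | grep "images" | cut -d '"' -f 2 >> images.4chan
done

# 3. download each unique image (rate-limited); links are protocol-relative, hence "http:"
for img in $(sort -u images.4chan); do
        wget -q --limit-rate=200k -P ./downloads "http:$img"
done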

Take a peek at how it looks when executed:

  1. Collecting links to sub-pages:

Collecting links to sub-pages of 4chan

  2. Collecting links to images:

Collecting links to 4chan images

  3. Downloading images:

Downloading images from 4chan

 


The function definitions are in a separate file below.

Without further ado, here it is:

dwm.sh


#!/bin/bash

 source myfunctions.sh

 ############################
 #        VARIABLES         #
 ############################

 today=`date +%d.%m.%y`
 time=`date +%H:%M:%S`
 cols=$(tput cols)
 lines=$(tput lines)
 download_counter=0;
 curltransfer="100000B"

 margin_offset=0;
 margin_text_offset=2;

 let top_box_t=0;
 let top_box_b=0+5;
 let top_box_l=0+$margin_offset;
 let top_box_r=$cols-$margin_offset;
 let top_box_width=$cols-$margin_offset-$margin_offset

 site="http://boards.4chan.org/hr/"

 if [[ ! -d "./$today" ]]; then  mkdir ./$today; fi

 tput civis
 clear

 draw_top_box;
 draw_bottom_box;
 scrap_links;

 # de-duplicate thread links, drop board navigation links and strip in-thread anchors (#...)
 cat links.4chan | sort -u >> uniquelinks.4chan
 if [[ -e links.4chan ]]; then rm links.4chan; fi
 cat uniquelinks.4chan | grep -v "board" | cut -d "#" -f 1 > tmp.4chan
 rm uniquelinks.4chan
 cat tmp.4chan | sort -u >> uniquelinks.4chan
 rm tmp.4chan

 scrap_images;

 cat images.4chan | sort -u >> uniqueimages.4chan
 rm images.4chan

 draw_panel;
 draw_headers;

 download_images;

 tput cup 20 0;

 tput cnorm

And the required functions file:
myfunctions.sh

#!/bin/bash

check_image()
{
        echo "1:" $1
        echo "2:" $2
}

draw_headers()
{
        tput cup $(($top_box_t+2)) $(($top_box_l+2));
        echo -en "$EINS\033[1;30m\033[40m#\033[0m";
        tput cup $(($top_box_t+2)) $(($top_box_l+6));
        echo -en "$EINS\033[1;30m\033[40mFILE NAME\033[0m";
}

download_images()
{

        let scroll_lines=$lines-6
        top=4

        tput cup $top 0
        scrolled_lines=1;
        allfiles=`cat uniqueimages.4chan | wc -l`

        index=0

        for i in `cat uniqueimages.4chan`
        do

                filename=`echo $i | cut -d "/" -f 6`
                if [[ $((index%$scroll_lines)) -eq 0 ]];
                then
                        tput cup $top 0
                        for ((j=0; j<$scroll_lines; j++))
                        do
                                echo -e "$EINS\033[32m\033[40m                                                                          \033[0m";
                        done
                        tput cup $top 0
                fi

                echo -ne "\033[s"

#               if [[ $index -gt 999 ]];
#               then
#                       tput cup $top 0
#                        for ((j=0; j<$scroll_lines; j++))
#                        do
#                                echo -e "$EINS\033[32m\033[40m                                                                          \033[0m";
#                        done
#                        tput cup $top 0
#                       let index=1
#                       echo -e "   $index  $EINS\033[30m\033[47mhttp:$i\033[0m"
#               fi
                if [[ $index -lt 10 ]];
                then
                        echo -e "   $index  $EINS\033[30m\033[47mhttp:$i\033[0m"
                elif [[ $index -lt 100 && $index -gt 9 ]];
                then
                        echo -e "  $index  $EINS\033[30m\033[47mhttp:$i\033[0m"
                elif [[ $index -lt 1000 && $index -gt 99 ]];
                then
                        echo -e " $index  $EINS\033[30m\033[47mhttp:$i\033[0m"
                elif [[ $index -gt 999 ]];
                then
                        tput cup $top 0
                        for ((j=0; j<$scroll_lines; j++))
                        do
                                echo -e "$EINS\033[32m\033[40m                                                                          \033[0m";
                        done
                        tput cup $top 0
                        echo -ne "\033[s"
                        let index=1
                        echo -e "   $index  $EINS\033[30m\033[47mhttp:$i\033[0m"
                fi

                #DOWNLOADING HERE
                color=1
                size=0
                # skip files already recorded in the download log (create the log if it is missing)
                touch ./4chan_download.log
                download_check=`cat ./4chan_download.log | grep $filename | wc -l`
                if [[ $download_check -eq 0 ]];
                then
                        let color=1
                        wget -q --limit-rate=200k -P ./$today http:$i
                        size=`ls -hls ./$today/$filename | cut -d " " -f 6`
                        #ls -hls ./$today/$filename | cut -d " " -f 5
                        echo "$filename" >> ./4chan_download.log
                        let download_counter=$download_counter+1
                else
                        let color=2
                fi

                echo -ne "\033[u"
                if [[ $index -lt 10 ]];
                then
                        echo -en "   $index  $EINS\033[m\033[40mhttp:$i\033[0m"
                        if [[ $color -eq 1 ]];
                        then
                                echo -e  "\t[$EINS\033[32m\033[40m+\033[0m]"
                        else
                                echo -e  "\t[$EINS\033[33m\033[40m*\033[0m]"
                        fi
                elif [[ $index -lt 100 && $index -gt 9 ]];
                then
                        echo -en "  $index  $EINS\033[m\033[40mhttp:$i\033[0m"
                        if [[ $color -eq 1 ]];
                        then
                                echo -e  "\t[$EINS\033[32m\033[40m+\033[0m]"
                        else
                                echo -e  "\t[$EINS\033[33m\033[40m*\033[0m]"
                        fi
                elif [[ $index -lt 1000 && $index -gt 99 ]];
                then
                        echo -en " $index  $EINS\033[m\033[40mhttp:$i\033[0m"
                        if [[ $color -eq 1 ]];
                        then
                                echo -e  "\t[$EINS\033[32m\033[40m+\033[0m]"
                        else
                                echo -e  "\t[$EINS\033[33m\033[40m*\033[0m]"
                        fi
                fi

                let index=$index+1

                echo -ne "\033[s";
                #draw_bottom_box;
                tput cup $top_box_t $(($top_box_l+20));
                echo -en "$EINS\033[30m\033[47mDOWNLOADED $download_counter/$allfiles\033[0m";
                echo -ne "\033[u";
        done
}

scrap_images()
{
 tput cup $(($top_box_t+5)) $(($top_box_l+1));
 echo -en "$EINS\033[1;30m\033[40mSCRAPING IMAGES\033[0m";
 tput cup $(($top_box_t+5)) $(($top_box_l+20));
 echo -en "$EINS\033[1;30m\033[40m[\033[0m";
 tput cup $(($top_box_t+5)) $(($top_box_l+36));
 echo -en "$EINS\033[1;30m\033[40m]\033[0m";

 urls=`cat uniquelinks.4chan | wc -l`
 index=0;

 position=21
 for i in `cat uniquelinks.4chan`;
 do

        let index=$index+1
        tput cup $top_box_t $(($top_box_l+20));
        echo -en "$EINS\033[30m\033[47mSCRAPED $index/$urls\033[0m";

        #HERE GOES THE CODE FOR images/# SCRAPPING
        let left=$position
        tput cup $(($top_box_t+5)) $(($top_box_l+$left));
        echo -en "$EINS\033[1;30m\033[40m-\033[0m";

        curl -s --limit-rate 10000B http://boards.4chan.org/hr/$i | grep -o '<a .*href=.*>' | sed -e 's/<a /\n<a /g' |  sed -e 's/<a .*href=['"'"'"]//' -e 's/["'"'"'].*$//' -e  '/^$/ d' | grep "images" | uniq >> images.4chan

        let position=$position+1
        if [[ $position -eq 36 ]];
        then
                tput cup $(($top_box_t+5)) $(($top_box_l+36));
                echo -en "$EINS\033[1;30m\033[40m]\033[0m";
                let position=21;
                tput cup $(($top_box_t+5)) $((1+$position));
                echo -en "$EINS\033[1;30m\033[40m              \033[0m";
        fi

 done

#CLEAN PROGRESS BAR
 for i in {1..14};
 do
        let left=$((19+$i))
        tput cup $(($top_box_t+5)) $(($top_box_l+$left));
        echo -en "$EINS\033[1;30m\033[40m \033[0m";
 done

#MARK AS COMPLETE
 tput cup $(($top_box_t+5)) $(($top_box_l+20+14));
 echo -en "$EINS\033[1;30m\033[40m[\033[0m";
 echo -en "$EINS\033[32m\033[40m+\033[0m";
 echo -en "$EINS\033[1;30m\033[40m]\033[0m";

#CLEAN COUNTER
 tput cup $top_box_t $(($top_box_l+20));
 echo -en "$EINS\033[30m\033[47m                                \033[0m";

}

scrap_links()
{
 if [[ -e links.4chan ]];
 then
        rm links.4chan;
 fi
 tput cup $(($top_box_t+4)) $(($top_box_l+1));
 echo -en "$EINS\033[1;30m\033[40mSCRAPING LINKS\033[0m";
 tput cup $(($top_box_t+4)) $(($top_box_l+20));
 echo -en "$EINS\033[1;30m\033[40m[\033[0m";
 tput cup $(($top_box_t+4)) $(($top_box_l+36));
 echo -en "$EINS\033[1;30m\033[40m]\033[0m";

#CLEAN OUTPUT FILE
        if [[ -e links.4chan ]]; then rm links.4chan; fi

#SCRAP THE FIST PAGE
 curl -s  --limit-rate $curltransfer http://boards.4chan.org/hr/ | grep -o '<a .*href=.*>' | sed -e 's/<a /\n<a /g' |  sed -e 's/<a .*href=['"'"'"]//' -e 's/["'"'"'].*$//' -e '/^$/ d' | grep "res/" | sort -u >> links.4chan

#SCRAP REST
 for i in {1..15};
 do
        let left=$((20+$i))
        tput cup $(($top_box_t+4)) $(($top_box_l+$left));
        echo -en "$EINS\033[1;30m\033[40m-\033[0m";
        curl -s  --limit-rate $curltransfer http://boards.4chan.org/hr/$i | grep -o '<a .*href=.*>' | sed -e 's/<a /\n<a /g' |  sed -e 's/<a .*href=['"'"'"]//' -e 's/["'"'"'].*$//' -e '/^$/ d' | grep "res/" | sort -u  >> links.4chan
 done

#CLEAN PROGRESS BAR
 for i in {1..14};
 do
        let left=$((19+$i))
        tput cup $(($top_box_t+4)) $(($top_box_l+$left));
        echo -en "$EINS\033[1;30m\033[40m \033[0m";
 done

#MARK AS COMPLETE
 tput cup $(($top_box_t+4)) $(($top_box_l+20+14));
 echo -en "$EINS\033[1;30m\033[40m[\033[0m";
 echo -en "$EINS\033[32m\033[40m+\033[0m";
 echo -en "$EINS\033[1;30m\033[40m]\033[0m";
}

function draw_top_box()
{
 for (( i=0; i<$top_box_width; i++ ))
 do
        let left=$top_box_l+$i;
        tput cup $top_box_t $left;
        echo -en "$EINS\033[30m\033[47m \033[0m";
 done

 tput cup $top_box_t $(($top_box_l+2));
 echo -en "$EINS\033[30m\033[47mDWM v.1.0\033[0m";
 tput cup $top_box_t $(($cols-20));
 echo -en "$EINS\033[30m\033[47m$time | $today\033[0m";

 tput cup $lines 0;

}

function draw_bottom_box()
{
 # park the cursor at the bottom-left corner, then paint the bottom status line
 tput cup $lines 0;
 for (( i=0; i<$top_box_width; i++ ))
 do
        let left=$top_box_l+$i;
        tput cup $lines $left;
        echo -en "$EINS\033[30m\033[47m \033[0m";
 done

 tput cup $lines 0;
 echo -en "$EINS\033[30m\033[47m  DOWNLOADED FILES: $download_counter\033[0m";
}

function draw_panel()
{
 for (( i=0; i<$top_box_width; i++ ))
 do
        let left=$top_box_l+$i;
        tput cup $(($top_box_t+1)) $left;
        echo -en "$EINS\033[1;30m\033[40m-\033[0m";
 done

 for (( i=0; i<$top_box_width; i++ ))
 do
        let left=$top_box_l+$i;
        tput cup $(($top_box_t+3)) $left;
        echo -en "$EINS\033[1;30m\033[40m-\033[0m";
 done

 tput cup $(($top_box_t+2)) $top_box_l;
 echo -en "$EINS\033[1;30m\033[40m|\033[0m";
 tput cup $(($top_box_t+2)) $top_box_r;
 echo -en "$EINS\033[1;30m\033[40m|\033[0m";
 tput cup $(($top_box_t+2)) $(($top_box_l+4));
 echo -en "$EINS\033[1;30m\033[40m|\033[0m";

 tput cup $(($top_box_t+1)) $top_box_l;
 echo -en "$EINS\033[1;30m\033[40m+\033[0m";
 tput cup $(($top_box_t+1)) $top_box_r;
 echo -en "$EINS\033[1;30m\033[40m+\033[0m";

 tput cup $(($top_box_t+3)) $top_box_l;
 echo -en "$EINS\033[1;30m\033[40m+\033[0m";
 tput cup $(($top_box_t+3)) $top_box_r;
 echo -en "$EINS\033[1;30m\033[40m+\033[0m";

 tput cup $(($top_box_t+3)) $(($top_box_l+4));
 echo -en "$EINS\033[1;30m\033[40m+\033[0m";

 tput cup $(($top_box_t+1)) $(($top_box_l+4));
 echo -en "$EINS\033[1;30m\033[40m+\033[0m";

 tput cup $(($top_box_t+1)) $(($top_box_r-5));
 echo -en "$EINS\033[1;30m\033[40m+\033[0m";

 tput cup $(($top_box_t+3)) $(($top_box_r-5));
 echo -en "$EINS\033[1;30m\033[40m+\033[0m";

 tput cup $(($top_box_t+2)) $(($top_box_r-5));
 echo -en "$EINS\033[1;30m\033[40m|\033[0m";

}

Be aware that all scripts are run at your own risk and while every script has been written with the intention of minimising the potential for unintended consequences, the owners, hosting providers and contributors cannot be held responsible for any misuse or script problems.


Apache access log parser

[with reverse DNS check and colors]

Nothing special here really. Just a few lines of code to make log review a little bit easier.

Displayed columns in order from left to right (a worked example of the field extraction follows this list):

  • Date and time of access
  • HTTP CODE of response [200 in green, 404 in blue, rest in red]
  • IP address
  • Reverse DNS hostname [last 30 chars] [" - " if NXDOMAIN]
  • Request [first 30 chars]
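
To see where those columns come from, here is how the script's cut chains slice a typical combined-format log line (the sample line is made up for illustration):

line='203.0.113.7 - - [24/Sep/2012:14:05:12 +0200] "GET /index.php HTTP/1.1" 200 5123 "-" "Mozilla/5.0"'

echo $line | cut -d " " -f 1                                      # IP: 203.0.113.7
echo $line | cut -d "[" -f 2 | cut -d "]" -f 1 | cut -d "+" -f 1  # date/time: 24/Sep/2012:14:05:12
echo $line | cut -d "]" -f 2 | cut -d "\"" -f 2 | cut -d " " -f -2  # request: GET /index.php
echo $line | cut -d "\"" -f 3 | cut -d " " -f 2                   # HTTP code: 200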

 


The output:

apache log parsing output


Script:


#!/bin/bash

while read line
do
        # IP
          ip=$(echo $line | cut -d " " -f 1)

        # HOST
          host=$(host $ip | cut -d " " -f 5 | tail -1)
          if [[ ${#ip} -lt 15 ]]; then
                for (( i=$(echo "15-${#ip}"|bc); i>0; i-- )) do
                        ip="$ip "
                done
          fi

        # IF I DO NOT GET DOMAIN NAME
          if [[ $(echo "$host" | grep "NXDOMAIN" | wc -l ) -ne 0 ]]; then
                host=" - "
          fi

        # EVEN UP THE HOSTNAME TO SEE LAST 30 CHARS
          if [[ ${#host} -lt 30 ]]; then
                for (( i=$(echo "30-${#host}"|bc); i>0; i-- )) do
                        host="$host "
                done
          else
                host=${host:$(echo "${#host}-30"|bc)}
          fi

          dhost="\033[01;30m$host\033[00m"

        #   DISPLAY GOOGLEBOT CUSTOM DNS
          if [[ $(echo $host | grep google |wc -l) -eq 1 ]]; then
                dhost="\033[01;30mGOOGLEBOT\033[00m                     "
          fi

        # DATE
          date=$(echo $line | cut -d "[" -f 2 | cut -d "]" -f 1 | cut -d "+" -f 1)
                day=$(echo $date | cut -d ":" -f 1 | tr -d " ")
                dtime=$(echo $date | cut -d ":" -f 2- | tr -d " ")

        # REQUEST
          req=$(echo $line | cut -d "]" -f 2 | cut -d "\"" -f 2 | cut -d " " -f -2)
        # CUT REQUEST TO 30 CHARS
          dreq=${req:0:30}
        # CUSTOM REQUEST INFO IN CASE OF ADMIN PANEL
          if [[ $(echo $req | grep "admin.php" | wc -l) -eq 1 ]]; then
                dreq="\033[01;31mFAV\033[00m"
          fi

        # HTTP CODE
          code=$(echo $line | cut -d "\"" -f 3 | cut -d " " -f 2)
          hcode="\033[01;31m$code\033[00m";
          if [[ "$code" -eq "200" ]]; then
                hcode="\033[01;32m$code\033[00m";
          fi
          if [[ "$code" -eq "404" ]]; then
                hcode="\033[01;34m$code\033[00m";
          fi

        # DISPLAY
          # I DONT WANT TO DISPLAY FAVICON REQUESTS
          if [[ $(echo $req | grep "favicon.ico" | wc -l) -eq 1 ]]; then
                echo -n ""
          else
                echo -e "$day $dtime $hcode $ip $dhost $dreq"
          fi
done < /var/log/apache2/access.log


cpu usage graph in console


The license and source code are available at Google Code.


Just a simple script that displays a processor usage graph in the command line.

This is how it looks when running:

CPU graph in console

…or a longer run with more realistic usage:

CPU graph with more realistic usage

There are two thresholds defined: usage above 40% is shown in yellow and above 60% in red.


Requirements (the ones that I can think of):

  • sleep (sleeping for a fraction of a second is not supported by older versions of sleep)

  • mpstat (see the extraction sketch below)
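
For reference, this is how the full script below pulls the idle percentage out of mpstat; note that the idle column index may differ between sysstat versions:

# average idle over a one-second sample; 100 minus this value is the CPU usage
mpstat 1 1 | grep "Average" | tail -1 | sed 's/ \+/ /g' | cut -d " " -f 11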

Example function used to draw in console:

function green()
  {
        echo -en "\033[s"
        tput cup $1 $2
        echo -en "\033[1;32m|\033[0m"
        echo -en "\033[u"
  }

\033[s saves the current position of the cursor and \033[u restores it to the saved position.
tput cup places the cursor at a specific row and column.


Full script:

#!/bin/bash
 
# VARIABLES
  columns=40
  touch cpu.log   # make sure the history file exists on the first run
 
# FUNCTIONS #
function draw()
{
        echo -en "\033[s"
        tput cup $1 $2
        echo -en "\033[1;3$3m$4\033[0m"
        echo -en "\033[u"
}
function wipe()
{
        echo -en "\033[s"
        tput cup $1 $2
        echo -en "\033[1;31m \033[0m"
        echo -en "\033[u"
}
#############
 
# HIDE CURSOR AND CLEAR SCREEN #
  tput civis && clear
 
while true; do
 
# COLLECT UPDATE
#
 
        # average idle percentage over a one-second mpstat sample (the idle column index may vary between sysstat versions)
        idle=`mpstat 1 1 | grep "Average" | tail -1 | sed 's/ \+/ /g' | cut -d " " -f 11 | tr -d "\n"`;
 
        usage=`echo "scale=0;(100-$idle)/10" |bc`;
 
        # BACKUP PREVIOUS DATA #
        # CUT THE OLDEST LINE  #
        cat cpu.log | tail -$columns >> temp.tmp
 
        # UPDATE STATS #
        if [[ $usage -eq 0 ]];
        then
                # IF USAGE IS BELOW 10% STILL DRAW A SINGLE BAR #
                echo "1" >> temp.tmp
        else
                echo $usage >> temp.tmp
        fi
 
        # UPDATE LOG #
        cat temp.tmp > cpu.log
        rm temp.tmp
 
        # DRAW GRAPH #
 
        var=6;
        # BEGIN FROM COLUMN 1 #
        j=1
 
        while read usage
        do
                # DRAW USAGE #
                for i in `seq 1 $usage`
                do
                        top=`echo "11-$usage" | bc`
                        var=`echo "11-$i"     | bc`
 
                        if [[ $usage -gt 3 ]]; then
                                if [[ $usage -gt 6 ]]; then
                                        if [[ $var -eq $top ]];then
                                                draw    $var $j "1" "+"
                                        else
                                                draw    $var $j "1" "|"
                                        fi
                                else
                                        if [[ $var -eq $top ]];then
                                                draw    $var $j "3" "+"
                                        else
                                                draw    $var $j "3" "|"
                                        fi
                                fi
                        else
                                if [[ $var -eq $top ]]; then
                                        draw     $var $j "2" "+"
                                else
                                        draw     $var $j "2" "|"
                                fi
                        fi
                done
 
                # WIPE PREVIOUS BAR REMNANTS IF THEY EXIST #
                usage=`echo "$usage+1" | bc`
 
                for k in `seq $usage 11`;
                do
                        var=`echo "11-$k" | bc`
                        wipe $var $j
                done
 
                # PROCEED TO NEXT COLUMN #
                j=`echo "$j+1" | bc`
 
                # ADD LATENCY IF NEEDED #
                sleep 0.1
 
        done < "cpu.log"
        tput cup 11 0
#
done